Is there any reason not to enforce HTTPS on a website?
Solution 1:
In this day and age, TLS + HSTS are markers that your site is managed by professionals who can be trusted to know what they're doing. That is an emerging minimum standard for trustworthiness, as evidenced by Google stating it will give a positive ranking boost to sites that use them.
On the other end is maximum compatibility. There are still older clients out there, especially in parts of the world that aren't the United States, Europe, or China. Plain HTTP will always work (though not always work well; that's another story).
TLS + HSTS: Optimize for search-engine ranking
Plain HTTP: Optimize for compatibility
Depends on what matters more for you.
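For illustration, here is a minimal sketch of what "TLS + HSTS" looks like on the serving side, using only Python's standard library; the certificate paths and port are placeholders, not anything prescribed by the answer above.

```python
# Minimal sketch: an HTTPS server that sends an HSTS header.
# "cert.pem"/"key.pem" and port 8443 are placeholder values.
import ssl
from http.server import BaseHTTPRequestHandler, HTTPServer

class HSTSHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        # Browsers remember this for max-age seconds (one year here)
        # and will refuse to talk plain HTTP to this host until then.
        self.send_header("Strict-Transport-Security",
                         "max-age=31536000; includeSubDomains")
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"hello over HTTPS\n")

if __name__ == "__main__":
    httpd = HTTPServer(("", 8443), HSTSHandler)
    context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    context.load_cert_chain("cert.pem", "key.pem")  # placeholder paths
    httpd.socket = context.wrap_socket(httpd.socket, server_side=True)
    httpd.serve_forever()
```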
Solution 2:
There is one good reason for simple read-only websites not to use HTTPS:
- Web caches can't cache images that are transported over HTTPS.
- Some parts of the world have very low-speed international connections, and so depend on those caches.
- Hosting images from another domain takes skills that you can't expect the operators of small read-only websites to have.
Solution 3:
The maintainer claims that TLS must be optional. Why?
To truly know the answer to this question, you must ask them. We can, however, make some guesses.
In corporate environments, it's common for IT to install a firewall that inspects incoming and outgoing traffic for malware, suspicious command-and-control activity, content deemed inappropriate for work (e.g. pornography), and so on. This becomes much harder when the traffic is encrypted. There are essentially three possible responses:
- Give up on monitoring this traffic.
- Install a root CA on users' machines so you can perform MitM decryption and inspection.
- Wholesale block encrypted traffic.
For a concerned sysadmin, none of these options is particularly appealing. There are a great many threats that attack a corporate network, and it is their job to protect the company against them. However, blocking a great many sites outright raises the ire of users, and installing a root CA can feel a bit scummy, as it introduces privacy and security considerations for users. I remember seeing (sorry, can't find the thread) a sysadmin petitioning reddit when it first turned on HSTS, because he was in exactly this situation: he didn't want to block all of reddit simply because the business compelled him to block the porn-focused subreddits.
The wheels of technology keep churning ahead, and you'll find many who argue that this sort of protection is old-fashioned and should be phased out. But there are still many who practice it, and perhaps it is they with whom your mysterious maintainer is concerned.
Solution 4:
There are several good reasons to use TLS
(and only a few marginal reasons not to).
- If the site has any authentication, using plain HTTP exposes sessions and passwords to theft.
- Even on static, merely informational sites, using TLS ensures no one has tampered with the data.
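To make the session-stealing point concrete, here is a minimal sketch of the Secure flag on a session cookie (the cookie name and value are made up); without TLS, the cookie crosses the wire in cleartext for anyone on the path to read, and Secure only helps once the site speaks HTTPS.

```python
# Minimal sketch: mark a session cookie so browsers never send it
# over plain HTTP. Cookie name and value are illustrative.
from http import cookies

c = cookies.SimpleCookie()
c["session"] = "opaque-session-id"
c["session"]["secure"] = True     # only sent over HTTPS
c["session"]["httponly"] = True   # not readable from page JavaScript
print(c.output())  # e.g. Set-Cookie: session=opaque-session-id; HttpOnly; Secure
```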
Since Google I/O 2014, Google has taken several steps to encourage all sites to use HTTPS:
- Google has been helping webmasters configure their servers more securely, and has also used HTTPS as a ranking signal.
- More recently, Google Chrome has started marking HTTP sites as non-secure, as part of a long-term plan to mark all HTTP sites as non-secure.
- The Google Chrome Developers team's Mythbusting HTTPS lecture states their attitude clearly.
The Mozilla Security Blog has also announced Deprecating Non-Secure HTTP: making all new features available only to secure websites and gradually phasing out access to browser features for non-secure websites, especially features that pose risks to users’ security and privacy.
There are also several good reasons to enforce TLS
If you already have a widely trusted certificate, why not always use it? Practically all current browsers support TLS and have root certificates installed. The only compatibility problem I've actually seen in years has been Android devices and a missing intermediate certificate authority, as Android only trusts root CAs directly. This can easily be prevented by configuring the server to send the full chain of certificates back to the root CA.
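As a quick way to spot that kind of problem, here is a minimal sketch (the hostname is a placeholder) that attempts a strictly validating handshake; a server that omits its intermediate certificate will typically fail it, since Python's default context, much like Android, does not go fetch missing intermediates on its own.

```python
# Minimal sketch: check whether a server's certificate chain validates
# against the system root store. Hostname and port are placeholders.
import socket
import ssl

def chain_validates(host: str, port: int = 443) -> bool:
    context = ssl.create_default_context()  # system roots, hostname checking
    try:
        with socket.create_connection((host, port), timeout=5) as sock:
            with context.wrap_socket(sock, server_hostname=host):
                return True  # handshake built a chain to a trusted root
    except ssl.SSLCertVerificationError:
        return False  # e.g. "unable to get local issuer certificate"

if __name__ == "__main__":
    print(chain_validates("example.com"))
```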
If your maintainer still would like to allow HTTP connections without a direct 301 Moved Permanently redirect, say for ensuring access from some really old browsers or mobile devices, there is no way for the browser to know that you even have HTTPS configured. Furthermore, you shouldn't deploy HTTP Strict Transport Security (HSTS) without the 301 Moved Permanently:
7.2. HTTP Request Type: If an HSTS Host receives an HTTP request message over a non-secure transport, it SHOULD send an HTTP response message containing a status code indicating a permanent redirect, such as status code 301 (Section 10.3.2 of [RFC2616]), and a Location header field value containing either the HTTP request's original Effective Request URI (see Section 9, "Constructing an Effective Request URI") altered as necessary to have a URI scheme of "https", or a URI generated according to local policy with a URI scheme of "https".
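As an illustration of the RFC text just quoted, a minimal sketch of the plain-HTTP side of such a site (the host name is a placeholder): every request is answered with a 301 pointing at the same path under https.

```python
# Minimal sketch: the plain-HTTP listener 301-redirects everything to
# HTTPS, per RFC 6797 section 7.2. HTTPS_HOST is a placeholder.
from http.server import BaseHTTPRequestHandler, HTTPServer

HTTPS_HOST = "www.example.com"  # assumption: the site's canonical HTTPS host

class RedirectToHTTPS(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(301)  # permanent redirect, as the RFC suggests
        self.send_header("Location", f"https://{HTTPS_HOST}{self.path}")
        self.end_headers()

    do_HEAD = do_GET  # HEAD requests get the same redirect, no body either way

if __name__ == "__main__":
    HTTPServer(("", 80), RedirectToHTTPS).serve_forever()
```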
The problem of various sites configured for both protocols is recognized by The Tor Project and the Electronic Frontier Foundation and addressed by the multi-browser HTTPS Everywhere extension:
Many sites on the web offer some limited support for encryption over HTTPS, but make it difficult to use. For instance, they may default to unencrypted HTTP, or fill encrypted pages with links that go back to the unencrypted site.
Mixed content was also a huge problem due to possible XSS attacks on HTTPS sites through modifying JavaScript or CSS loaded via a non-secure HTTP connection. Therefore all mainstream browsers nowadays warn users about pages with mixed content and refuse to load it automatically. This makes it hard to maintain a site without the 301 redirect on HTTP: you must ensure that every HTTP page only loads HTTP content (CSS, JS, images etc.) and every HTTPS page only loads HTTPS content. That's extremely hard to achieve with the same content on both.
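To illustrate how tedious that audit gets, here is a minimal sketch that flags subresources hard-coded to http:// in a page's HTML, i.e. exactly what a browser would block as mixed content on the HTTPS version. The URL is a placeholder and the tag list is not exhaustive.

```python
# Minimal sketch: find subresources hard-coded to http:// in a page.
# The URL is a placeholder; the tag list is not exhaustive.
from html.parser import HTMLParser
from urllib.request import urlopen

SUBRESOURCE_TAGS = {"script", "img", "link", "iframe", "audio", "video", "source"}

class MixedContentFinder(HTMLParser):
    def handle_starttag(self, tag, attrs):
        if tag not in SUBRESOURCE_TAGS:
            return
        for name, value in attrs:
            if name in ("src", "href") and value and value.startswith("http://"):
                print(f"mixed content: <{tag}> loads {value}")

html = urlopen("https://www.example.com/").read().decode("utf-8", "replace")
MixedContentFinder().feed(html)
```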
Solution 5:
It all comes down to your security requirements, user choice, and the risk of implicit downgrades. Disabling old ciphers server-side is largely necessary because browsers will otherwise happily fall back to absolutely horrible ciphers client-side in the name of user experience/convenience. Making sure that nothing of yours which depends on a secure channel to the user can be reached over an insecure method is, of course, also very sound.
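As a concrete example of the server-side half of that, a minimal sketch using Python's ssl module to refuse old protocol versions and weak suites; the cipher string is illustrative, not a recommendation.

```python
# Minimal sketch: refuse old protocol versions and weak cipher suites
# server-side, so clients cannot silently fall back to them.
import ssl

context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.minimum_version = ssl.TLSVersion.TLSv1_2   # no SSLv3/TLS 1.0/TLS 1.1
context.set_ciphers("ECDHE+AESGCM:ECDHE+CHACHA20:!aNULL:!MD5")
# context.load_cert_chain(...) and socket wrapping would follow as usual.
```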
Not allowing me to explicitly downgrade to insecure HTTP, when I've deemed that your blog post about why you like Python more than Ruby (not saying you do, just a generic example) isn't something I mind the spooks or the public knowing I accessed, just gets in my way for no good reason, on the assumption that HTTPS will be trivial for me.
There are, today, embedded systems which don't have the ability to use TLS out of the box, or ones which are stuck on old implementations (I think it's awfully bad that this is so, but as a power user of [insert embedded device here], I sometimes can't change this).
Here's a fun experiment: try downloading a recent version of LibreSSL from the upstream OpenBSD site over HTTPS with a sufficiently old TLS/SSL implementation. You won't be able to. I tried the other day on a device with an older OpenSSL build from 2012 or so, because I wanted to upgrade this embedded system to newer, more secure software from source - I don't have the luxury of a prebuilt package. The error messages weren't exactly intuitive, but I presume the older OpenSSL simply didn't support the right protocol versions or cipher suites.
This is one example where the move to HTTPS-only can actually hurt people: if you don't have the luxury of recent pre-built packages and want to fix the problem yourself by building from source, you're locked out. Thankfully, in the LibreSSL case, you can fall back to explicitly requesting HTTP. Sure, this won't save you from an attacker already rewriting your traffic, capable of replacing source packages with compromised versions and rewriting all the checksums in HTTP bodies describing the packages available for download on the pages you browse, but it's still useful in the much more common case.
Most of us aren't one unsecured download away from being owned by an APT (Advanced Persistent Threat: security jargon for national intelligence agencies and other extremely well-resourced cyber threats). Sometimes I just want to wget some plain-text documentation or a small program whose source I can quickly audit (my own tiny utilities/scripts on GitHub, for example) onto a box that doesn't support the most recent cipher suites.
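If you do fall back to plain HTTP like that, one partial mitigation worth sketching is to verify the download against a checksum you obtained earlier over a channel you trust; as noted above, it helps nothing if the checksum itself came over the same untrusted connection. The URL and digest below are placeholders.

```python
# Minimal sketch: verify a plain-HTTP download against a digest obtained
# earlier over a trusted channel. URL and digest are placeholders.
import hashlib
from urllib.request import urlopen

EXPECTED_SHA256 = "0" * 64  # assumption: a published digest you already trust

data = urlopen("http://www.example.com/tool.tar.gz").read()
actual = hashlib.sha256(data).hexdigest()
print("OK" if actual == EXPECTED_SHA256 else "MISMATCH: do not use this file")
```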
Personally, I'd ask this: is your content such that a person could legitimately decide "I'm okay with my access to this being public knowledge"? Is there a plausible chance of real risk to non-technical people accidentally downgrading to HTTP for your content? Weigh your security requirements, your enforced-privacy-for-your-users requirements, and the risk of implicit downgrades against the ability of users who understand the risks to make an informed, case-by-case choice to go unsecured. It's entirely legitimate to say that for your site there's no good reason not to enforce HTTPS - but I think it's fair to say that there are still good use-cases for plain HTTP out there.