Anonymity and the Secure Web

Ben Klemens has written an essay criticising Mozilla’s moves towards an HTTPS web. In particular, he is worried about the difficulty of setting up an HTTPS website and the fact that (as he sees it) getting a certificate requires the disclosure of personal information. There were some misunderstandings in his analysis, so I wanted to add a comment to clarify what we are actually planning to do, and how we are going to meet his concerns.

However, he wrote it on Medium. Medium does not have its own login system; it only permits federated login using Twitter or Facebook. Here’s the personal information I would have to give away to Medium (and the powers I would have to give it) in order to comment on his essay about the problems Mozilla are supposedly causing by requiring people to give away personal information:

[Screenshot: the account permissions Medium requests when logging in with Twitter]

Don’t like that? That’s OK; I could use Facebook login instead, if I were willing to give away:

[Screenshot: the account permissions Medium requests when logging in with Facebook]

So I’ll have to comment here and hope he sees it. (Anyone who has decided the tradeoffs on Medium are worth it could perhaps post the URL in a comment for me.)

The primary solution to his issues is Let’s Encrypt. With Let’s Encrypt, you will be able to get a cert which works in 99%+ of the browsers anyone uses, without needing to supply any personal information or to pay, all for the effort of running a single command on the command line. That is, the command line of the machine (or VM) you have rented from the service provider to whom you gave your credit card details, and to whom you make a monthly payment, in order to put up your DIY site. That machine. And the cert will be for the domain name for which you pay your registrar a yearly fee, and to whom you have also provided your personal information. That domain name.
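
To make “a single command” concrete, here is a purely illustrative sketch; the client name, flags and paths below are assumptions for the sake of example, not a description of the final Let’s Encrypt interface:

    # Illustrative sketch only: the client name, flags and paths are assumptions,
    # not the final Let's Encrypt client.
    letsencrypt certonly --webroot -w /var/www/example.com -d example.com

Run on the server that already hosts the site, something along these lines would prove control of the domain and fetch a certificate for it, with no forms to fill in and nothing to pay.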

If you have a source of free, no-information-required server hosting and free, no-information-required domain names (as Ben happens to for his Caltech Divinity School example), then it’s reasonable to say that you are a little inconvenienced if your HTTPS certificate is not also free and no-information-required. But most people doing homebrew DIY websites aren’t in that position – they have to rent such things. Once Let’s Encrypt is up and running, the situation with certificates will actually be easier and more anonymous than that with servers or domain names.

“Browsers no longer supporting HTTP” may well never happen, and it’s a long way off if it does. But insofar as the changes we do make are some small infringement on your right to build an insecure website, see it as a civic requirement, like passing a driving test. This is a barrier to someone just getting in a car and driving, but most would suggest it’s reasonable given the wider benefit to society of training those in control of potentially dangerous technology. Given the Great Cannon and similar technologies, which can repurpose accesses to any website as a DDoS tool, there are no websites which “don’t need to be secure”.

11 thoughts on “Anonymity and the Secure Web”

  1. Regardless of the pros and cons of Mozilla’s choice to deprecate non-HTTPS sites, Mozilla screwed up immensely by letting this news get out before Let’s Encrypt is actually up and running. Very poor public relations.

  2. What do you mean by “letting this news get out”? For a start, it was not “let out” – we announced it. And there is, as yet, no timetable for the changes we plan to make. So it seems a bit odd to criticise the timing of any change when there is no timing.

    Clearly, the ease of availability of certificates is one factor in our decisions about what to do and when. But some of the things we want to do require a lead time and warning, and it seems reasonable to start that process before LE is up and running.

    (Note that I am not the final decision-maker here.)

  3. But this is the crux of the problem here:

    “But insofar as the changes we do make are some small infringement on your right …”

    I’ve seen a lot of thoughtful comments about s/http/https/g, but I’ve also seen a lot of this kind of trivializing commentary — along the lines of “Oh, stop being so worried. Trust us, we’ll work everything out just fine.”

    The goal of the policies under discussion (and they are *policies*, not implementation issues) is to remove the very possibility of a plain text alternative/backup to https. Even if that turns out to be difficult, or only happens in the very distant future, that *is* the goal. (And it may happen quickly or slowly — who knows.) Understandably, a lot of people (myself included) are scared witless that there’s some small chance this might be devastatingly bad for the web, and that the only recourse then available will be arduous and maybe even Sisyphean (like trying to unscramble the surveillance omelette, or fighting back against the centralisation around the services you mention in your post).

    I’m slowly being assured that is not the case with s/http/https/g, but there are still lots of situations that haven’t been addressed well. Some of those affect me directly (as I write web-based software that I give to people, who run it like a program), so it’s very hard from my perspective to view these policies as “some small infringement”. Nonetheless, I *can* see a way forward on those issues.

    The privacy issues are still a more general concern, though — and the argument that things are already bad doesn’t cut it. Let’s Encrypt will work well, if it works as intended. But if Charlie is guaranteeing that *any* information that passes between Alice and Bob will be unobservable by anyone else, surely we want “anyone else” to include Charlie. But what if Charlie ends up compromised? Then his guarantee is worth nothing. And now we have the problem that Charlie gave the guarantee to the entire Internet.

  4. While the previous comment could certainly be worded better, I do feel like there’s a bit of putting the cart before the horse here. I do think moving things to SSL, and failing that, HTTP opportunistic encryption (OE), is a good idea, and we should move in that direction. However, I feel like announcing all the negative aspects before having the mitigation plans (Let’s Encrypt) in place was a bad idea; it means people start panicking without being able to _do_ anything about it, which makes the whole thing a much more negative experience. It would probably go better if it went mitigation -> announcement -> enforcement, instead.

    This wasn’t the only instance, either; the add-on signing requirements recently went basically the same way. The announcement of all the punishment has been made, but people can’t actually do anything about it yet. Except in that case the mitigation seemed more like “if we get around to it”, which is even scarier…

    Contrast this with, to pick an example at random, the announcement of code.google.com closing. They announced it together with a tool to migrate to GitHub. It still sucks that it will happen, but there’s something the affected people can _do_ about it (whether it’s effective or not being beside the point).

  5. I wonder if the lets-encrypt package works with a node web server that isn’t configured to serve static files. Looking at https://letsencrypt.org/howitworks/ it’s not clear it does.

    Supposing it does, I also wonder what we can do to make setting up a local test server easier. Generating a self-signed certificate and adding it to the certificate store of each device/browser/app you want to test with is a real hassle. Is there an easier way?
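
    To illustrate the hassle, here is roughly what just the generation step looks like today with the Python “cryptography” library (a sketch only; the exact API calls and defaults may differ between versions), and that’s before touching the trust store of a single device, browser or app:

        # Sketch: generate a throwaway self-signed certificate for localhost.
        # Assumes the third-party "cryptography" package; older versions also
        # require an explicit backend argument.
        import datetime
        from cryptography import x509
        from cryptography.x509.oid import NameOID
        from cryptography.hazmat.primitives import hashes, serialization
        from cryptography.hazmat.primitives.asymmetric import rsa

        key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
        name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, u"localhost")])
        now = datetime.datetime.utcnow()
        cert = (
            x509.CertificateBuilder()
            .subject_name(name)           # self-signed: subject and issuer are the same
            .issuer_name(name)
            .public_key(key.public_key())
            .serial_number(x509.random_serial_number())
            .not_valid_before(now)
            .not_valid_after(now + datetime.timedelta(days=30))
            .add_extension(x509.SubjectAlternativeName([x509.DNSName(u"localhost")]),
                           critical=False)
            .sign(key, hashes.SHA256())
        )
        # Write the key and certificate out as PEM for the local test server.
        with open("localhost-key.pem", "wb") as f:
            f.write(key.private_bytes(
                encoding=serialization.Encoding.PEM,
                format=serialization.PrivateFormat.TraditionalOpenSSL,
                encryption_algorithm=serialization.NoEncryption()))
        with open("localhost.pem", "wb") as f:
            f.write(cert.public_bytes(serialization.Encoding.PEM))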

  6. I think it’s enormously premature to object to moving more towards HTTPS because one possible outcome of many, years down the line, might be enormously bad for the web. Mozilla is dedicated to preserving and protecting the open web; do you really think we would keep pushing in this direction if the massive negative consequences you invoke turned out to be looming on the horizon? HTTP is unlikely to ever go away entirely; what the HTTP experience looks like in web browsers 10 years from now depends on a whole load of factors, including how successful we are in moving everyone to HTTPS. Which itself depends on how easy we can make it.

    I think that if Let’s Encrypt became the world’s most popular CA, that would be a good problem to have. :-) The issue in your last paragraph is best fixed by things like a full implementation of Certificate Transparency, not by avoiding HTTPS altogether.

  7. There’s probably a downside I’m not seeing, but what if we simply accepted self-signed certs for localhost only, without warning? After all, if a service is running on the local machine, it’s going to be a trusted service. Perhaps only for ports < 1024.

  8. But the announcement was the start of a discussion which will help us understand the potential problems and develop appropriate mitigations. You can’t mitigate what you don’t know about, and Mozilla is not arrogant enough to assume that we know all of the possible problems this move will cause.

    If we can get past “the sky is falling!” and move on to a sober assessment of the possible problems and how they can be addressed, that would be much more constructive. I think we are past the “whether” stage for “more HTTPS”; let’s have a good discussion about “how”.

  9. The announcement starts with “Today we are announcing our intent to phase out non-secure HTTP.” It ends with “Thanks to the many people who participated in the mailing list discussion of this proposal. Let’s get the web secured!” That’s not a discussion; that’s a statement. It’s nonsensical to _announce_ discussions; that’s presenting a done decision that isn’t going to be revisited.

    If the post had actually been framed as a discussion, I don’t think there would have been that much backlash. As it is right now, it’s unclear where a good discussion is taking place. There’s a giant mailing list thread, but, as far as I can tell, no place tracking the issues brought up and the solutions proposed, so it’s difficult for people to update themselves on current thinking.

    And none of that matters anyway; I believe most people agree that, in the abstract, more HTTPS is better. They just believe that in their specific circumstances they won’t be able to transition before the (completely undefined) deprecation period is over. That’s why I thought having the mitigation in place before the announcement would have worked better. If people could see ways to avoid the issues over the medium term, there wouldn’t be such a backlash.

  10. It’s part of a process. We had a mailing list discussion, agreed in principle that this was the way forward, and announced it (the title says “Deprecating”, not “Eliminating”) – and we will be having further discussions in future. I think the direction of travel (i.e. the principle) is a done decision, and it’s also a done decision for the five or six major Internet governance entities (IAB, W3C, etc.) which have announced similar plans or efforts. But the distance we travel is to be decided as we go along.

    It would be foolish to say now “we are definitely going to remove HTTP entirely from Firefox” and then refuse to change that plan over the next ten years whatever the evidence showed. And that’s why we aren’t saying that, and we won’t do that.

    I take your point about tracking issues and proposed solutions, although I think that given the way this is going to be done, those are best attached to individual proposals for feature restrictions.
