Those of us who help create and maintain “the internet” that everyone benefits from are now tasked with helping the world recover from one of the biggest, if not the biggest, security holes in the history of the internet.
To be certain they aren’t vulnerable, users need to change their passwords at every site that was at any point vulnerable to a Heartbleed attack. But a site has to be patched, and its SSL certificate has to be reissued with a newly generated secret key, before its password should be changed; otherwise, the new password is just as vulnerable to Heartbleed as the old one was. What’s more, you can’t just look at the start date of an SSL certificate to determine whether it was reissued, because that doesn’t tell you whether the site was patched before the certificate was deployed, and worse than that, some CAs (e.g., Digicert) quite reasonably re-key certificates without changing their original start dates.
I have passwords at over 500 sites. I’m sure there are people who use many more sites than that. Manually figuring out which sites need their passwords changed, and when to change them, and keeping track of which ones have been changed, is an impossible task.
What we need is a standard, widely adopted way for web sites to indicate, in a way that can be easily interpreted by software, whether they were ever vulnerable to Heartbleed, and if so, when the vulnerability was patched. Then browsers and password keepers such as LastPass can easily determine and track which user passwords need to be changed, and warn the user.
I’m hoping that someone else with far more clout in the browser development and/or internet security community has already thought of this idea and started the ball rolling on making it happen. But just in case that hasn’t happened, here’s my proposed standard:
- Web sites conforming to this standard shall serve the URL “/heartbleed.txt”.
- The file can have LF, CRLF, or CR line endings.
- Blank lines and lines starting with optional whitespace and then “#” are ignored.
- The file must have a line which reads (with optional whitespace between any of the lexical tokens) “vulnerable: 1” or “vulnerable: 0” to indicate whether the site was ever vulnerable to Heartbleed.
- The file must have a line which reads (with optional whitespace between any of the lexical tokens) “patched: 1” or “patched: 0” to indicate whether the vulnerability has been fully patched.
- The file must have a line which reads (with optional whitespace between any of the lexical tokens) “patched_at: iso-8601-date-time”, where iso-8601-date-time is a valid ISO 8601 date and time in UTC (only UTC is acceptable, to make the file as easy and reliable as possible for consumers of the file to parse).
- Any other lines in the file are ignored, to allow for future, backward-compatible enhancements to this standard (not that I anticipate any will be necessary, but of course famous last words and all that).
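To make the rules above concrete, here is a minimal sketch of how a password manager might parse this file and decide whether to warn the user. The function names and the `password_changed_at` parameter are my own illustrative choices, not part of the proposal:

```python
import re
from datetime import datetime, timezone

def parse_heartbleed_txt(text):
    """Parse the proposed /heartbleed.txt format into a dict.

    Accepts LF, CRLF, or CR line endings; ignores blank lines,
    comment lines, and any unrecognized lines.
    """
    fields = {}
    for line in re.split(r"\r\n|\r|\n", text):
        if not line.strip() or line.lstrip().startswith("#"):
            continue  # blank line or comment
        m = re.match(r"\s*(vulnerable|patched)\s*:\s*([01])\s*$", line)
        if m:
            fields[m.group(1)] = m.group(2) == "1"
            continue
        m = re.match(r"\s*patched_at\s*:\s*(\S+)\s*$", line)
        if m:
            # The proposal allows only UTC ISO 8601 timestamps.
            fields["patched_at"] = datetime.strptime(
                m.group(1), "%Y-%m-%dT%H:%M:%SZ"
            ).replace(tzinfo=timezone.utc)
    return fields

def needs_password_change(fields, password_changed_at):
    """Return True if the user's password should be changed now.

    password_changed_at: timezone-aware datetime of the last change.
    """
    if not fields.get("vulnerable"):
        return False  # site was never vulnerable
    if not fields.get("patched"):
        return False  # don't change yet: the new password would leak too
    return password_changed_at < fields["patched_at"]
```

Note that the logic deliberately tells the user to wait if the site is still unpatched, since a password changed before the patch is just as exposed as the old one.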
If someone else has already thought of this idea, please point me to it on-line and I’ll deprecate my own proposal and put a link to theirs here instead. The last thing we want is multiple proposals floating around for solving this problem.
UPDATE:
One could make an argument for generalizing this, e.g., calling the file “vulnerabilities.txt”, allowing multiple vulnerabilities to be listed in it with the vital stats for each of them, etc. I’m not sure whether we should do this now, at the expense of making it take longer for this idea to be adopted, or postpone it for later when we aren’t in the-internet-is-burning mode. If we were going to generalize it, then I would probably say that the URL should be /vulnerabilities.txt, it should be formatted as YAML, and it should have top-level keys corresponding to CVE identifiers and for each such identifier the “vulnerable”, “patched”, and “patched_at” fields as subkeys.
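Under that generalization, the file might look something like this (CVE-2014-0160 is Heartbleed’s actual identifier; the timestamp here is made up for illustration):

```yaml
# /vulnerabilities.txt — one top-level key per CVE identifier
CVE-2014-0160:
  vulnerable: 1
  patched: 1
  patched_at: 2014-04-09T18:30:00Z
```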
I suspect EFF’s “SSL Observatory” data could prove very useful here – perhaps you might want to talk to them.
Awesome idea. I’ll reach out to them.
Might talk to the folks behind http://perspectives-project.org/ as well.
This is the first I’ve heard about this – thanks. (Admittedly I don’t hang out in tech-y circles.) What should a layperson be doing at this point?
There isn’t really a great answer to that, which is why I posted this idea.
First and foremost, you should be using a password manager to store all of your passwords. In my opinion, this is no longer optional; anyone who tries to remember all of their passwords without the assistance of a password manager is using weak passwords, using the same or similar passwords on multiple sites, and/or writing passwords down somewhere. All of these are very bad ideas. Personally, I use and highly recommend LastPass. You can use it for free if you only want to use it in a web browser, or pay for the service if you want to also use it on phones and tablets, which I personally think is well worth the money.
Once you start using a password manager, each time you visit a site that isn’t in the password manager yet, you should change your password on that site to something long and random that is stored in the password manager. That way you’ll gradually improve your security by having unique, strong passwords for all of the sites you use.
As for Heartbleed in particular, it’s difficult to know how to deal with it, since even if you change your password to something long and random, if the site is vulnerable to Heartbleed and isn’t patched yet, then you’re still at risk of having your password and/or account there compromised. There’s no great fix for that, hence this proposal.
Important update!
As of today, the LastPass Security Check you can run on the sites in your LastPass vault will tell you if any of the sites for which you have passwords were affected by the Heartbleed bug, and whether they’ve been patched, and whether you should change your password.
If you’re already a LastPass user, go to your vault and run the Security Check and do what it says, and make a habit of checking it periodically over the next few weeks as more sites are patched and/or found to be vulnerable.
If you’re not already a LastPass user, now would be a good time to consider becoming one!
This is a great idea, but wouldn’t we be giving attackers a far easier method for determining website vulnerabilities?
I.e., if a site has a vuln.txt file that doesn’t list the latest CVE, one can assume that CVE has not been patched.
And there would have to be some method for verifying that sites are actually patching what they say, not just updating the file right away and then patching when more convenient.
Attackers already have tools to quickly scan sites for all known vulnerabilities. Making it easier for the good guys to know what has been patched and what hasn’t isn’t going to make things any easier for the bad guys.
There’s not much you can do about a web site operator that lies to you about whether their site has been patched. They have incentives not to lie about things like this, because (a) if they lie and then user data are subsequently compromised as a result of the lie, it will come out and it will make them look very bad, and (b) telling the truth increases the security of their users’ data.
You haven’t addressed digitally signing this file. If a site has been compromised, what’s to stop the bad guys from claiming that it has been patched? (Yes, I know that heartbleed itself doesn’t leave a trace, and this would, but still…)
I admit that the vulnerability you mention exists, but I don’t think that’s the major threat we need to be worried about here, and I would be concerned that adding a digital signature requirement, both on the web site maintainers to generate them properly and on the browser / app developers to validate them properly, would hamper quick adoption of the idea and thereby render it essentially moot.
Unless this proposal is made compliant with RFC5785, I would not recommend implementation.
Does anybody actually use RFC 5785 for anything? I’d never even heard about it until you posted this comment.
If there is sufficient penetration and it’s a standard that people are actually paying attention to, I certainly have no objection to using /.well-known/heartbleed.txt instead of /heartbleed.txt.
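If both locations ended up in the wild, a client could simply try the well-known path first and fall back to the bare path. A rough sketch (the fallback order and timeout are my own assumptions):

```python
import urllib.request
from urllib.error import HTTPError, URLError

# Try the RFC 5785 well-known location first, then the bare path.
CANDIDATE_PATHS = ["/.well-known/heartbleed.txt", "/heartbleed.txt"]

def fetch_heartbleed_txt(origin):
    """Return the heartbleed.txt body for an origin like
    'https://example.com', or None if neither location serves it."""
    for path in CANDIDATE_PATHS:
        try:
            with urllib.request.urlopen(origin + path, timeout=10) as resp:
                if resp.status == 200:
                    return resp.read().decode("utf-8", "replace")
        except (HTTPError, URLError):
            continue  # try the next candidate path
    return None
```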
Well-known URIs are registered with IANA; in fact one was just registered yesterday: https://www.iana.org/assignments/well-known-uris/well-known-uris.xhtml
Vote this up at https://news.ycombinator.com/item?id=7559317!