I’ve seen several people recently discussing how LastPass protects your LastPass master password and your encrypted site password data (a.k.a., your vault). If what some of those people were saying were true, then LastPass wouldn’t be as secure as I thought it was. This gave me pause, since I use LastPass to store all my passwords, so I decided to do some research to try to understand for myself exactly how it works. Now that I’ve done that, it seems to me that others might benefit from my research, and in any case writing it down will clarify it in my own mind, so here it is.
Those of us who help create and maintain "the internet" that everyone benefits from are now tasked with helping the world recover from one of the biggest, if not the biggest, security holes in the history of the internet.
To be certain they aren’t vulnerable, users need to change their passwords at every site that was at any point vulnerable to a Heartbleed attack. But a site has to be patched, and its SSL certificate has to be reissued with a newly generated secret key, before its password should be changed; otherwise, the new password is just as vulnerable to Heartbleed as the old one was. What’s more, you can’t just look at the start date of an SSL certificate to determine whether it was reissued, because that doesn’t tell you whether the site was patched before the certificate was deployed, and worse than that, some CAs (e.g., Digicert) quite reasonably re-key certificates without changing their original start dates.
I have passwords at over 500 sites. I’m sure there are people who use many more sites than that. Manually figuring out which sites need their passwords changed, and when to change them, and keeping track of which ones have been changed, is an impossible task.
What we need is a standard, widely adopted way for web sites to indicate, in a way that can be easily interpreted by software, whether they were ever vulnerable to Heartbleed, and if so, when the vulnerability was patched. Then browsers and password keepers such as LastPass can easily determine and track which user passwords need to be changed, and warn the user.
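To make the idea concrete, here is a minimal sketch of how a password keeper might use such a disclosure. The field names ("ever_vulnerable", "patched_at", "cert_reissued_at") are my own invention for illustration; no such standard exists:

```python
from datetime import datetime, timezone

def password_needs_change(disclosure, password_changed_at):
    """Decide whether a stored password should be rotated, given a site's
    (hypothetical) machine-readable Heartbleed disclosure.

    `disclosure` is a dict with hypothetical fields:
      ever_vulnerable  -- was the site ever exposed to Heartbleed?
      patched_at       -- when OpenSSL was patched (datetime)
      cert_reissued_at -- when the re-keyed certificate was deployed (datetime)
    """
    if not disclosure.get("ever_vulnerable", False):
        return False
    # The site is only safe after BOTH the patch and the certificate re-key,
    # so a password set before the later of the two may have leaked.
    safe_after = max(disclosure["patched_at"], disclosure["cert_reissued_at"])
    return password_changed_at < safe_after
```

A browser or password keeper could run this check against every stored credential and flag only the passwords that actually need rotating.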
Yesterday, the folks over at Dilbert.com changed their RSS feed, which the comics aggregator had been using to fetch Dilbert, so that method no longer works. I therefore had to rework the Dilbert support in the aggregator. It should be working again. Please let me know if it isn’t. And please consider supporting my work on the comics aggregator if you benefit from it.
Recently, I wrote about why my comics aggregator stopped supporting comic strips distributed by the King Features Syndicate. In a nutshell, King Features changed how their comics are distributed in a way that (a) makes it much harder for my aggregator to scrape them, and (b) makes it 100% clear that they don’t want people scraping their comics.
They’re the copyright holder for these strips, so it’s their right to decide how people are allowed to view them. Aside from the ethical concerns with intentionally bypassing their restrictions, I frankly don’t want them to sue me. So free access to their strips through my aggregator is no longer an option.
However, all of their strips are available for your viewing pleasure at Daily Ink for the low, low price of $1.99 per month or $19.99 per year. With a paid subscription to Daily Ink, you can get King Features strips daily on your smartphone, tablet, or computer, as well as in email.
If that’s good enough for you, then you can go sign up there now and stop reading. However, if you’d like to get strips from Daily Ink all in one place with the other strips you read through my aggregator, then read on.
The folks at the King Features Syndicate have always made their comics harder to aggregate than those of any of the other syndicates. I had implemented a convoluted workaround involving redirecting requests for images of their comics through the comics aggregator itself, where various special headers were inserted into the request to convince the King Features web servers to provide the requested content. That worked for quite a while, but as of today, they’ve apparently turned off their old web servers and replaced them with new ones which use a completely different protocol for requesting content.
In short, King Features comics are no longer working in the aggregator. This means that the following comics are currently broken:
Baby Blues
Curtis
Family Circus
Funky Winkerbean
Hagar the Horrible
Mallard Fillmore
Rhymes with Orange
Sally Forth
Zippy the Pinhead
Zits
I’m working on restoring them as best I can, so you will see them creep back into service one by one as I find new ways to retrieve them. It seems unlikely at this point that a single retrieval method will work for all of them, as one did before, and in the end some of them may prove to be no longer aggregatable.
I will post another update when I’ve done all that I can to get them back.
Perhaps I just don’t grok the zen of Hacker News, but I don’t understand why the site doesn’t provide RSS feeds of comments on postings.
I also don’t get why nobody has written something to provide this. Or, if someone has, I couldn’t find it (perhaps my Google mojo just isn’t high enough).
http://api.ihackernews.com/ gets part of the way there, but it’s an API, not an RSS feed, and it has been broken for quite a while.
So I wrote a script to generate such a feed.
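The core of such a script is simple. Here is a minimal sketch (not my actual CGI script) of the feed-generation step, assuming the comments have already been fetched as dicts with "id", "by", and "text" fields, the shape items have in the official Hacker News Firebase API:

```python
from xml.sax.saxutils import escape

def comments_to_rss(story_id, title, comments):
    """Render Hacker News comments as a minimal RSS 2.0 feed.

    Each comment is a dict with "id", "by", and "text" keys, as in
    the official HN Firebase API's item format.
    """
    items = []
    for c in comments:
        link = "https://news.ycombinator.com/item?id=%d" % c["id"]
        items.append(
            "<item><title>Comment by %s</title><link>%s</link>"
            "<description>%s</description></item>"
            % (escape(c["by"]), link, escape(c["text"]))
        )
    return (
        '<?xml version="1.0"?><rss version="2.0"><channel>'
        "<title>%s</title>"
        "<link>https://news.ycombinator.com/item?id=%d</link>%s"
        "</channel></rss>" % (escape(title), story_id, "".join(items))
    )
```

A CGI wrapper around this would take the story id from the query string, fetch the comment tree, and print the result with a Content-Type of application/rss+xml.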
You can see it in action at http://jik3.kamens.brookline.ma.us/cgi-bin/hn-comment-feed.cgi?id=5107045 (which I will have to remove if there is so much interest in this Show HN that people clicking on the link overwhelm my tiny little VPS).
(If you want to comment about this, please do on the Hacker News posting about it rather than here.)
UPDATE [2013-01-20]: The scripts weren’t reading the username and password from the config file properly. In fact, I had accidentally hard-coded my own Vonage username and password in this script. D’oh! Needless to say, I’ve changed my password, and the updated scripts below are fixed. Also, I updated the cron script to make the use of Xvfb conditional, so you can test the cron script on your desktop and watch it running by editing the script and setting the XVFB variable to “false”.
Until today, I’ve been paying Vonage a ridiculous amount of money (almost $400) for an annual plan with unlimited minutes. It turns out they’ve got a much cheaper monthly plan, something like $14 per month even after all the bogus surcharges, taxes and fees they tack on, with 300 outbound domestic minutes, 300 outbound international minutes, and unlimited inbound minutes. We rarely exceed either of those limits in a month, so the cheaper plan is a much better deal for us, and I switched to it earlier today.
However, we do exceed the domestic minutes limit in busy months, sometimes by quite a lot, and in those months I’d really like to know when we’re on track to exceed the limit, so we can take steps to avoid paying the 5¢ per minute overage charge, like using our cell phones (which have unlimited minutes) for more of our outbound calls.
Unfortunately, Vonage doesn’t provide any sort of account alerts feature which would notify us if we’re on track to exceed our limits, or at the very least if we get close to exceeding them at any point during the month. And I certainly don’t want to have to waste my time logging into the Vonage web site to check our usage!
I therefore decided to automate this check using Selenium. Below are two scripts: a Perl script which does the actual fetching of data from the Vonage site, and a shell script which you can run out of cron to set up the necessary environment, run the Perl script, and then clean up after itself.
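The alerting arithmetic itself is straightforward. Here is a sketch of the "on track to exceed" check (my own illustration, not the actual Perl script): project the month's usage linearly from the minutes used so far and compare against the plan's limit.

```python
import calendar

def on_track_to_exceed(minutes_used, day_of_month, year, month, limit=300):
    """Linearly project end-of-month usage from usage so far and
    report whether the projection exceeds the plan limit."""
    days_in_month = calendar.monthrange(year, month)[1]
    projected = minutes_used / day_of_month * days_in_month
    return projected > limit
```

For example, 150 minutes used by the 10th of a 30-day month projects to 450 minutes, well over a 300-minute limit, so the alert would fire with two-thirds of the month still left to adjust.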