Archive for the ‘Web’ Category

How LastPass protects your data

Thursday, April 10th, 2014

I’ve seen several people recently discussing how LastPass protects your LastPass master password and your encrypted site password data (a.k.a., your vault). If what some of those people were saying were true, then LastPass wouldn’t be as secure as I thought it was. This gave me pause, since I use LastPass to store all my passwords, so I decided to do some research to try to understand for myself exactly how it works. Now that I’ve done that, it seems to me that others might benefit from my research, and in any case writing it down will clarify it in my own mind, so here it is.

(more…)

We need a “/heartbleed.txt” standard, and we need it ASAP

Wednesday, April 9th, 2014

Those of us who help create and maintain “the internet” that everyone benefits from are now tasked with helping the world recover from one of the biggest, if not the biggest, security holes in the history of the internet.

To be certain they aren’t vulnerable, users need to change their passwords at every site that was at any point vulnerable to a Heartbleed attack. But a site has to be patched, and its SSL certificate has to be reissued with a newly generated secret key, before its password should be changed; otherwise, the new password is just as vulnerable to Heartbleed as the old one was. What’s more, you can’t just look at the start date of an SSL certificate to determine whether it was reissued, because that doesn’t tell you whether the site was patched before the certificate was deployed, and worse than that, some CAs (e.g., Digicert) quite reasonably re-key certificates without changing their original start dates.

I have passwords at over 500 sites. I’m sure there are people who use many more sites than that. Manually figuring out which sites need their passwords changed, and when to change them, and keeping track of which ones have been changed, is an impossible task.

What we need is a standard, widely adopted way for web sites to indicate, in a way that can be easily interpreted by software, whether they were ever vulnerable to Heartbleed, and if so, when the vulnerability was patched. Then browsers and password keepers such as LastPass can easily determine and track which user passwords need to be changed, and warn the user.
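To make the idea concrete, here’s a rough sketch of the kind of thing I have in mind. The file format, field names, and checker below are invented for illustration only; nothing like this exists yet, which is exactly the problem.

    # Hypothetical sketch only: the "/heartbleed.txt" format and field names
    # below are invented for illustration; no such standard exists yet.
    #
    # A site might publish something like this at https://example.com/heartbleed.txt:
    #
    #   was-vulnerable: yes
    #   patched: 2014-04-08T19:30:00Z
    #   certificate-reissued: 2014-04-09T02:00:00Z

    from datetime import datetime
    from urllib.request import urlopen

    def parse_heartbleed_txt(text):
        """Parse the simple 'key: value' format sketched above into a dict."""
        fields = {}
        for line in text.splitlines():
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            key, _, value = line.partition(":")
            fields[key.strip().lower()] = value.strip()
        return fields

    def password_needs_changing(site, password_last_changed):
        """Return True if the password stored for 'site' predates its recovery."""
        with urlopen("https://%s/heartbleed.txt" % site) as resp:
            fields = parse_heartbleed_txt(resp.read().decode("utf-8"))

        if fields.get("was-vulnerable") != "yes":
            return False  # the site says it was never vulnerable

        # Safe only if the password was changed after both the patch and the
        # certificate re-issuance, whichever happened later.
        recovered_at = max(
            datetime.fromisoformat(fields["patched"].replace("Z", "+00:00")),
            datetime.fromisoformat(fields["certificate-reissued"].replace("Z", "+00:00")),
        )
        return password_last_changed < recovered_at

A password manager already knows when each of your passwords was last changed, so with data like this it could flag exactly the sites where a change is still needed.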

(more…)

Dilbert was broken, now fixed in comics aggregator

Wednesday, June 26th, 2013

Yesterday, the folks over at Dilbert.com changed their RSS feed, which is what the comics aggregator had been using to fetch Dilbert, so that method no longer works. Therefore, I had to refactor the Dilbert support in the aggregator. It should be working again. Please let me know if it isn’t. And please consider supporting my work on the comics aggregator if you benefit from it.

Johnny Monsarrat link round-up

Tuesday, June 11th, 2013

MIT Peeps: Have you heard what JonMon is up to nowadays?

Monday, May 6th, 2013

This post is mostly for the benefit of my fellow MIT alumni.

Remember Jonathan Monsarrat?

If so, then you may have gotten the same impression of him back at the ‘tute that I did, to wit, that he gave off creep vibes.

Judging from what has been written about him online since then, he’s done other things to reinforce that vibe, but this just about takes the cake… He has recently filed a lawsuit against several people, including Ron Newman whom many of you probably also know, alleging that they defamed him in discussions that took place in various forums on LiveJournal almost three years ago.

The lawsuit is clearly baseless and doomed to fail if it makes it to trial, but it seems likely that JonMon isn’t so much interested in prevailing in court as in scaring people into removing their LiveJournal postings by raising the specter of an expensive legal defense.

Ken White has put up the Popehat signal asking for pro bono legal help defending against this attempt to suppress people’s free speech.

I’ve known Ron online for over two decades. He’s a good guy, and if he ends up needing financial help defending himself against this suit, you bet I’ll be contributing to his legal defense fund.

If there’s anything you can do to help — whether it’s providing pro bono assistance yourself, hooking Ron up with somebody who can provide such assistance, or just kicking in a few bucks to his legal defense fund if one is set up — then I encourage you to do so. And if you’re an MIT alum, make sure to mention it when you help…. It’s important to let people know that JonMon’s conduct is considered neither normative nor acceptable to most of the MIT community.

 

Comics aggregator supports King Features again, sort of

Wednesday, April 3rd, 2013

Recently, I wrote about why my comics aggregator stopped supporting comic strips distributed by the King Features Syndicate. In a nutshell, King Features changed how their comics are distributed in a way that (a) makes it much harder for my aggregator to scrape them, and (b) makes it 100% clear that they don’t want people scraping their comics.

They’re the copyright holder for these strips, so it’s their right to decide how people are allowed to view them. Aside from the ethical concerns with intentionally bypassing their restrictions, I frankly don’t want them to sue me. So free access to their strips through my aggregator is no longer an option.

However, all of their strips are available for your viewing pleasure at Daily Ink for the low, low price of $1.99 per month or $19.99 per year. With a paid subscription to Daily Ink, you can get King Features strips daily on your smart-phone, tablet or computer, as well as in email.

If that’s good enough for you, then you can go sign up there now and stop reading. However, if you’d like to get strips from Daily Ink all in one place with the other strips you read through my aggregator, then read on.

(more…)

King Features Syndicate comics currently unavailable

Friday, February 8th, 2013

The folks at the King Features Syndicate have always made their comics harder to aggregate than those of any of the other syndicates. I had implemented a convoluted workaround involving redirecting requests for images of their comics through the comics aggregator itself, where various special headers were inserted into the request to convince the King Features web servers to provide the requested content. That worked for quite a while, but as of today, they’ve apparently turned off their old web servers and replaced them with new ones which use a completely different protocol for requesting content.
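For the curious, the old workaround amounted to a tiny image proxy. The sketch below is not the aggregator’s actual code, and the header names and values are placeholders for whatever the old King Features servers checked for; it just illustrates the shape of the trick: rewrite the comic image URLs to point at a local CGI endpoint, which re-requests the image with the expected headers and streams it back.

    #!/usr/bin/env python3
    # Illustrative sketch only, not the aggregator's actual code. The header
    # values are placeholders for whatever the old King Features servers
    # required. A real deployment would also restrict which URLs it is
    # willing to fetch, to avoid becoming an open proxy.

    import os
    import sys
    from urllib.parse import parse_qs
    from urllib.request import Request, urlopen

    def main():
        query = parse_qs(os.environ.get("QUERY_STRING", ""))
        image_url = query.get("url", [""])[0]

        req = Request(image_url, headers={
            # Placeholder headers; the real workaround inserted whatever the
            # King Features servers checked for at the time.
            "Referer": "http://example-king-features-site.invalid/",
            "User-Agent": "Mozilla/5.0",
        })
        with urlopen(req) as resp:
            body = resp.read()
            content_type = resp.headers.get("Content-Type", "image/gif")

        sys.stdout.write("Content-Type: %s\r\n\r\n" % content_type)
        sys.stdout.flush()
        sys.stdout.buffer.write(body)

    if __name__ == "__main__":
        main()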

In short, King Features comics are no longer working in the aggregator. This means that the following comics are currently broken:

6Chix
Baby Blues
Beetle Bailey
Bizarro
Blondie
Curtis
Family Circus
Flash
Funky Winkerbean
Hagar The Horrible
Lockhorns
Mallard Fillmore
Mutts
Phantom
Prince Valiant
Rhymes with Orange
Sally Forth
Sherman’s Lagoon
Zippy the Pinhead
Zits

I’m working on restoring them as best I can, so you will see them creep back into service one by one as I find new ways to retrieve them (it seems unlikely at this point that I will be able to use a single retrieval method for all of them, as I was before), but in the end, some of them may prove to be no longer aggregatable.

I will post another update when I’ve done all that I can to get them back.

CGI script for RSS feed of comments on Hacker News posting

Thursday, January 24th, 2013

Perhaps I just don’t grok the zen of Hacker News, but I don’t get why the site doesn’t provide RSS feeds of comments on postings.

I also don’t get why nobody has written something to provide this. Or, at least, if it has been written, I couldn’t find it (perhaps my Google mojo just isn’t high enough).

http://api.ihackernews.com/ gets part of the way there, but it’s an API, not an RSS feed, and it’s broken and has been for quite a while.

So I wrote a script to generate such a feed.
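Here’s a stripped-down sketch of the idea (not the actual script, whose retrieval method may differ): it pulls a story’s comment tree from the Algolia Hacker News Search API, which returns the comments as nested JSON, and emits a bare-bones RSS 2.0 feed with one item per comment.

    #!/usr/bin/env python3
    # Minimal sketch of the idea, not the actual hn-comment-feed.cgi script.
    # It fetches a story's comment tree from the Algolia Hacker News Search
    # API (https://hn.algolia.com/api/v1/items/<id>) and emits a bare-bones
    # RSS feed with one <item> per comment.

    import json
    import os
    from urllib.parse import parse_qs
    from urllib.request import urlopen
    from xml.sax.saxutils import escape

    def flatten_comments(item, out):
        """Depth-first walk of the nested 'children' arrays in the API response."""
        for child in item.get("children", []):
            if child.get("text"):  # deleted comments have no text
                out.append(child)
            flatten_comments(child, out)

    def main():
        story_id = parse_qs(os.environ.get("QUERY_STRING", "")).get("id", ["0"])[0]
        with urlopen("https://hn.algolia.com/api/v1/items/%s" % story_id) as resp:
            story = json.load(resp)

        comments = []
        flatten_comments(story, comments)

        print("Content-Type: application/rss+xml")
        print()
        print('<?xml version="1.0" encoding="UTF-8"?>')
        print('<rss version="2.0"><channel>')
        print("<title>%s</title>" % escape("HN comments: %s" % (story.get("title") or story_id)))
        print("<link>https://news.ycombinator.com/item?id=%s</link>" % story_id)
        print("<description>Comments on Hacker News item %s</description>" % story_id)
        for c in comments:
            link = "https://news.ycombinator.com/item?id=%s" % c["id"]
            print("<item><title>%s</title><link>%s</link><guid>%s</guid>"
                  "<description>%s</description></item>"
                  % (escape("Comment by %s" % (c.get("author") or "unknown")),
                     link, link, escape(c["text"])))
        print("</channel></rss>")

    if __name__ == "__main__":
        main()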

You can see it in action at http://jik3.kamens.brookline.ma.us/cgi-bin/hn-comment-feed.cgi?id=5107045 (which I will have to remove if there is so much interest in this Show HN that people clicking on the link overwhelms my tiny little VPS).

(If you want to comment about this, please do so on the Hacker News posting about it rather than here.)

 

Using Selenium to monitor your Vonage balance

Thursday, January 17th, 2013

UPDATE [2013-01-20]: The scripts weren’t reading the username and password from the config file properly. In fact, I had accidentally hard-coded my own Vonage username and password in this script. D’oh! Needless to say, I’ve changed my password, and the updated scripts below are fixed. Also, I updated the cron script to make the use of Xvfb conditional, so you can test the cron script on your desktop and watch it run by editing the script and setting the XVFB variable to “false”.


Until today, I’ve been paying Vonage a ridiculous amount of money (almost $400) for an annual plan with unlimited minutes. It turns out they’ve got a much cheaper monthly plan, something like $14 per month even after all the bogus surcharges, taxes and fees they tack on, with 300 outbound domestic minutes, 300 outbound international minutes, and unlimited inbound minutes. We rarely exceed either of those limits in a month, so the cheaper plan is a much better deal for us, and I switched to it earlier today.

However, we do exceed the domestic minutes limit in busy months, sometimes by quite a lot, and in those months I’d really like to know when we’re on track to exceed the limit, so we can take steps to avoid paying the 5¢ per minute overage charge, like using our cell phones (which have unlimited minutes) for more of our outbound calls.
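For what it’s worth, here’s the back-of-the-envelope arithmetic behind the switch, using the rough figures above (so don’t hold me to the pennies):

    # Rough figures from above: ~$400/year for the old unlimited plan,
    # ~$14/month for the new metered plan, 5 cents per domestic overage minute.
    annual_plan = 400.00
    monthly_plan = 14.00 * 12          # ~$168/year
    overage_rate = 0.05                # dollars per minute over the 300/month cap

    break_even = (annual_plan - monthly_plan) / overage_rate
    print(break_even)                  # ~4640 overage minutes per year
    print(break_even / 12)             # ~387 extra minutes per month, every month

In other words, we would have to blow past the 300-minute cap by nearly 400 minutes every single month before the old plan came out ahead.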

Unfortunately, Vonage doesn’t provide any sort of account alerts feature which would notify us if we’re on track to exceed our limits, or at the very least if we get close to exceeding them at any point during the month. And I certainly don’t want to have to waste my time logging into the Vonage web site to check our usage!

Sai Charan solved this with a Python script, but I couldn’t use his script for two reasons: (1) it complained about a missing “libxmlmods” Python module, and I couldn’t figure out where to get it; and (2) Vonage recently rolled out a new web site, and I’m not convinced his script works with the new site. Not only that, but the new site seems to depend on JavaScript being enabled to work properly, so I don’t think a simple screen-scraping script like Sai’s is going to work anymore (I tried writing such a script using WWW::Mechanize in Perl, and couldn’t get it to work).

I therefore decided to automate this check using Selenium. Below are two scripts: a Perl script which does the actual fetching of data from the Vonage site, and a shell script which you can run out of cron to set up the necessary environment, run the Perl script, and then clean up after itself.
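Before the full scripts, here’s a rough outline (in Python rather than Perl) of what the Selenium part does. The URLs and element locators below are made up, since Vonage’s page structure is whatever it is on any given day, so treat this as a sketch of the approach, not working code against the live site.

    # Rough outline only; the real scripts below are Perl plus a shell wrapper,
    # and the URLs and element locators here are made up.
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    USERNAME = "you@example.com"     # in real life, read these from a config file
    PASSWORD = "not-my-real-password"

    driver = webdriver.Firefox()     # run under Xvfb when invoked from cron
    try:
        driver.get("https://www.vonage.com/")                          # hypothetical entry point
        driver.find_element(By.NAME, "username").send_keys(USERNAME)   # made-up locator
        driver.find_element(By.NAME, "password").send_keys(PASSWORD)   # made-up locator
        driver.find_element(By.NAME, "login").click()                  # made-up locator

        driver.get("https://my.vonage.com/usage")                      # hypothetical URL
        minutes = driver.find_element(By.ID, "domestic-minutes").text  # made-up locator
        print("Domestic minutes used this billing cycle:", minutes)
    finally:
        driver.quit()

The cron wrapper’s job is just environment setup: start Xvfb (unless you’ve turned that off for desktop testing, per the update above), point DISPLAY at it, run the script, and clean up afterwards.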

(more…)

xkcd.com added to comics aggregator

Monday, January 14th, 2013