Wednesday, March 31, 2010

turning off the :visited privacy leak

Since I started at Mozilla, I've been trying to increase momentum on fixing the history sniffing privacy leak. I've been able to get lots of people interested, and David Baron has worked hard to come up with a fix. This is a hard problem, and the stars have finally aligned: the Firefox source code, our thinking, research, and a need have come together to get this done.

David has nearly finished an implementation of a plug for the leak, and it's a pretty nice solution that strikes a balance between privacy and utility. In the end, we're going to have to break the web, but only a little bit, and in ways we believe can be recreated with other techniques.

The fix has three parts:
  1. :visited is restricted to color changes: size changes and other layout or loading effects are disabled. The colors that can change are the foreground, background, border, and SVG outline and stroke colors.
  2. getComputedStyle and similar functions will lie: all links will appear unvisited to the web site, but you'll still see the visitedness when the page is rendered (see the sketch below).
  3. The layout code has been restructured to minimize the difference in code paths for laying out visited and unvisited links. This should minimize timing attacks (though it can't eliminate them all).
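
To make concrete what part 2 breaks, here's a minimal sketch of the classic sniffing trick that getComputedStyle used to enable. The URL and the probe color are made-up values for illustration; with the fix in place, this function always returns false.

  // A sketch of the classic :visited sniffing trick (illustrative only;
  // the URL and the probe color are made-up examples).
  function wasVisited(url: string): boolean {
    const style = document.createElement("style");
    style.textContent = "a:visited { color: rgb(255, 0, 0); }";
    document.head.appendChild(style);

    const probe = document.createElement("a");
    probe.href = url;
    document.body.appendChild(probe);

    // Before the fix, getComputedStyle reported the :visited color;
    // with the fix, it always reports the unvisited style.
    const visited = getComputedStyle(probe).color === "rgb(255, 0, 0)";

    style.remove();
    probe.remove();
    return visited;
  }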

I don't think web sites should be able to extract your browsing history without your consent; this is one of the bits of the web that rubs me the wrong way, and I'm excited we've made some progress toward removing it. If it rubs you the wrong way too, and you just can't wait for our upcoming fix, you can turn off all visited links in Firefox 3.5 and newer. This breaks the web even more, but it's an immediate hack if you want to hide from the sniffers.

Over the last few years, I've been collecting a list of sites that show how this trick can be abused. Hopefully all of them will stop working with the new fix!

Friday, January 29, 2010

cookies by many different names

Cookies are great, and everyone loves them (chocolate chip are my favorite), but if we leave the Internet to its own devices, it could drive itself into a state of utter deception where other technologies are secretly used in place of cookies for tracking and identification purposes.

Having spent the past two days immersed in various privacy discussions, I've started thinking deeply about cookies and tracking again. The fundamental privacy concern about HTTP cookies (and other varieties like Flash LSOs) is that such a technology gives a web server too much power to connect my browsing dots. Third-party cookies exacerbate this problem -- as do features like DOM storage, Google Gears, etc.

Come to think of it, cookies aren't unique in their utility as dot-connectors: browsing history can also be used. A clever site can make guesses at a user's browsing history to learn things such as which online bank was recently visited. This is not an intended feature of browsing history, but it came about because such a history exists.

But wait, cookies, Flash LSOs, DOM storage, and browsing history aren't uniquely useful here either! Your browser's data cache can be used like cookies too! Cleverly crafted documents can be injected into your cache and then re-used from the cache to identify you.
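
To make the cache point concrete, here's a rough sketch of one well-known variant of this: using the ETag cache validator as a stand-in for a cookie. Everything in it (the port, the logging, the resource body) is invented for illustration; the point is just that the browser hands the identifier back on every revalidation.

  // Hypothetical sketch of a "cache cookie": the server hands each new
  // visitor a unique ETag, and the browser echoes it back in If-None-Match
  // whenever it revalidates the cached resource -- no cookie required.
  import { createServer } from "http";
  import { randomUUID } from "crypto";

  createServer((req, res) => {
    const seen = req.headers["if-none-match"];
    if (seen) {
      console.log("returning visitor:", seen);  // the dot-connecting step
      res.writeHead(304);                       // keep the cached copy (and its ETag) alive
      res.end();
      return;
    }
    const id = randomUUID();                    // brand-new identifier for this browser
    console.log("new visitor:", id);
    res.writeHead(200, {
      "ETag": id,
      "Cache-Control": "private, max-age=0, must-revalidate",
      "Content-Type": "text/javascript",
    });
    res.end("/* innocuous-looking cached resource */");
  }).listen(8080);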

In fact, all state data created or manipulated in a web browser by web sites has the potential to be a signal for tracking or other dot-connecting purposes. Even if the state change seems to be write-only, there could be other features that open up a way to read it back (e.g., the CSS history snooping trick mentioned above -- or timing attacks).

Stepping back and thinking about these dot-connecting "features" in the context of the last couple of days' privacy discussions has got me wondering whether there's a way we can better understand client-side state changes in order to holistically address this arbitrary spewing of identifying information. I think the first step toward empowering users to better protect themselves online is to understand what types of data are generated or transmitted by the browser, and which of them can be used for connecting the dots. Once we figure that out, maybe we can find a way to reflect this to users so they can put their profile on a leash.

But while we want to help users maintain the most privacy possible while browsing, we can't forget that many of these dot-connecting features are incredibly useful and removing them might make the Web much less awesome. I like the Web, I don't want it to suck, but I want my privacy too. Is there a happy equilibrium?

How useful is the Web with cookies, browsing history, and plug-ins turned off? Can we find a way to make it work? There are too many questions and not enough answers...

Monday, December 14, 2009

sluggish xorg

I had been fighting with what I thought was a really slow window manager, so I changed to a lighter-weight one, and it still took forever to draw things. After fiddling with stuff on and off for a few months, it turned out to be pretty simple: the radeon driver decided to use the CPU for too much of its own job.

I set a couple of flags (thanks to linportal), and everything is speedy again. So if you're fighting with a radeon driver that seems to be worthless (especially with multiple displays), try setting the "MigrationHeuristic" option to "greedy" in your xorg.conf's Device section:

Option "MigrationHeuristic" "greedy"
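
For context, that option goes inside the Device section of xorg.conf; a minimal sketch of such a section (the Identifier string is just a placeholder label):

  Section "Device"
      Identifier  "Radeon"
      Driver      "radeon"
      Option      "MigrationHeuristic" "greedy"
  EndSection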

Friday, November 20, 2009

update on HTTPS security

Version 2.0 of my Force-TLS add-on for Firefox was released by the AMO editors on Tuesday, and it incorporates a few important changes: it supports the Strict-Transport-Security header introduced by PayPal, and it has an improved UI that lets you add and remove sites from the forced list. For more information, see my Force-TLS web site.
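
For reference, the header itself is a single line in the HTTPS response, along these lines (the max-age value here is an arbitrary example):

  Strict-Transport-Security: max-age=2592000; includeSubDomains

A browser (or add-on) that understands it will insist on HTTPS for that host for the next max-age seconds; includeSubDomains extends that to the host's subdomains.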

On a similar topic, I've been working to actually implement Strict-Transport-Security in Firefox. The core functionality is in there, and if you want to play with some demo builds, grab a custom-built Firefox and play. These builds don't yet enforce certificate integrity as the spec requires, but aside from that, they implement STS properly.

The built-in version performs an internal redirect to upgrade channels -- before any request hits the wire. This is an improvement over the way the HTTP protocol handler was hacked up by version 1 of Force-TLS, and it doesn't suffer from the subtle bugs that can pop up when a channel's URI is mutated through an nsIContentPolicy. I'm not sure that add-ons can completely trigger the proper internal redirect, since not all of the HTTP channel code is exposed to scripts; add-ons would need to replicate some of the functions compiled into nsHttpChannel, opening up the possibility of obscure side effects if the add-on gets out of sync with the binary's version of those functions.
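
Conceptually, the upgrade boils down to something like the sketch below. This is only a script illustration of the idea, not the actual Gecko code, and the stsHosts entry is a made-up example.

  // Conceptual illustration only -- not the actual Gecko implementation.
  // Hosts that have sent Strict-Transport-Security are remembered, and any
  // http:// URL for one of them is rewritten before a request is made.
  const stsHosts = new Set<string>(["example.com"]);   // hypothetical entry

  function upgradeIfNeeded(url: string): string {
    const u = new URL(url);
    if (u.protocol === "http:" && stsHosts.has(u.hostname)) {
      u.protocol = "https:";   // the "internal redirect": no insecure request ever hits the wire
    }
    return u.toString();
  }

  // upgradeIfNeeded("http://example.com/login") => "https://example.com/login"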

Edit: The newest version of NoScript does channel redirecting by setting up a replacement channel in a really clever way -- pretty much the same as my patch. It replicates some of the internal-only code in nsHttpChannel, though, so it would need to be updated in NoScript if for some reason we change that code in Firefox.

Thursday, November 12, 2009

OWASP AppSec DC '09

I'm at OWASP AppSec DC '09 this week. If you're there too, come find me and say hi!

Monday, October 12, 2009

csp @ stanford security seminar

I'll be giving a talk at the October 13 Stanford Security Seminar. 4:30pm in Gates 4B. Show up if you're interested in CSP or want to heckle!

Friday, October 02, 2009

CSP Preview!

Brandon Sterne and I released a preview of Firefox with Content Security Policy features built in. There are still little bits of the specification that aren't yet ready (like HTTP redirection handling), but most of the core functionality should be there.

If you'd like to play around with this pre-release version of Firefox (very alpha, future release) that has CSP built in, download it here! You can test it out at Brandon's demo page.

In case you're not familiar with CSP, it's a content-restriction system that allows web sites to specify what types of content can be embedded in their pages and where that content can be loaded from. It's very similar to something called HTTP Immigration Control that I was working on in grad school, so I'm very excited to be part of the design, specification, and implementation -- hopefully a big step towards securing the web.
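
To give a flavor of it, a policy arrives as a response header along these lines (the header name and directive syntax are still being nailed down in the spec, and the image host below is just a placeholder):

  Content-Security-Policy: default-src 'self'; img-src 'self' https://images.example.com; script-src 'self'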

Previously: Shutting Down XSS with Content Security Policy and CSP: With or Without Meta?

Update: The old download link expired. New one should have a much longer lifetime (here).

Thursday, September 17, 2009

notawesome

While discussing privacy and Firefox 3.5 with Chris a couple weeks ago, we stumbled upon the thought that people might want to be able to select which bookmarks show up when they're given automatic suggestions in Firefox 3's Awesome Bar. This discussion really started with a bit of public metrics and discussion in the blogosphere.

In mid August, Ken Kovash wrote about reasons users gave for not upgrading from Firefox 2 to Firefox 3.0. The number one reason was, surprisingly, the Awesome Bar. Without going into detail, the gist was that people didn't really want certain bookmarks to show up when they start typing URLs.

Perhaps the settings weren't obvious enough, but users can set the awesome bar to search only bookmarks, only history, both, or neither (Alex Faaborg discussed it in June, in fact).

Here's the use case: Bob bookmarks a couple porn sites, then during a public presentation, he starts typing "www" in the URL bar. His porn sites show up in the suggestion list, and everyone in the audience gasps.

The work-arounds for this I see are:

  1. Use a separate browser for "private" sites.
  2. Use a separate Firefox profile for browsing "private" sites.
  3. Use Private Browsing when browsing "private" sites (but then you can't bookmark the sites).
  4. Turn off bookmarks and/or history searching for awesome bar.

But maybe this isn't good enough for everyone. Some folks might want to just hide a couple of bookmarks from the awesome bar. We need a way to make certain bookmarks "not awesome" so they won't show up.

Enter bookmark tags... you can add tags to bookmarks to find them easily. Why not tag bookmarks with "notawesome", then somehow hide those from the awesome bar search?
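
The filtering itself is conceptually simple -- something like the sketch below (the Suggestion shape is invented for illustration; the real add-on would have to work with Firefox's bookmark data rather than a plain array):

  // Conceptual sketch only: drop any suggestion whose bookmark carries the
  // "notawesome" tag. The Suggestion type is invented for illustration.
  interface Suggestion {
    url: string;
    title: string;
    tags: string[];
  }

  function filterAwesome(suggestions: Suggestion[]): Suggestion[] {
    return suggestions.filter(s => !s.tags.includes("notawesome"));
  }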

On a whim, I hacked together a quick addon to do this: notawesome!

Lifehacker picked up on this (dunno how they found it buried in AMO), and apparently some folks find it useful.

To those 800 people using it already: thanks for trying it out and for your comments! I'll see if I can find some time to make it better. If anyone else wants to hack on it, let me know...