
Wednesday, March 06, 2013

Who uses the password manager?


Who uses the password manager, and why? My colleague Monica Chew tries to answer these questions and more by measuring password manager use. 

Check out her blog post.

Monday, March 12, 2012

making DNT easier for web sites

Jos Boumans has done some analysis of the effect of turning on Do Not Track in your browser, and his findings show that sites in general have been slow to signal support for the feature.
"As it stands, only 4 out of 482 measured top 500 sites are actively responding to the DNT header being sent." (Link)
As a user, it's hard to tell if sites are honoring my Do Not Track request, and as a site developer, it might be a daunting task to hack up my back-end code.  The Tracking Protection Working Group at the W3C is working on improving transparency and implementations, but in the meantime Jos has released his mod_cookietrack Apache module to make it easier for site owners to track their users' clicks in a respectful way -- right now.
The Apache module, mod_cookietrack, does all sorts of stuff like mod_usertrack, but one thing it does better is honor DNT; if a server using this module sees "DNT: 1" in an HTTP request, it replaces the tracking cookie with one that says "DNT" -- something that's not unique to a visitor.

Apparently it was a lot of work to get DNT supported properly in mod_cookietrack -- a native Apache module that performs well and is safe across multiple threads -- so thanks, Jos, for your hard work helping more organizations support DNT on their web sites.
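The DNT-honoring idea itself is simple to state. Here's a minimal Python sketch of the behavior described above -- this is an illustration of the concept, not mod_cookietrack's actual C code, and the function name is mine:

```python
import uuid

def tracking_cookie(headers, existing_cookie=None):
    # Honor Do Not Track: if the client sent "DNT: 1", hand out the
    # shared, non-unique "DNT" value instead of a per-visitor ID.
    if headers.get("DNT") == "1":
        return "DNT"
    # Otherwise keep the visitor's existing ID or mint a new one.
    return existing_cookie or uuid.uuid4().hex
```

Every DNT user gets the same cookie value, so click counts still work in aggregate while individual visitors can't be singled out.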


Sunday, February 26, 2012

Malware and Phishing Protection in Firefox

For a while, Firefox has included malware and phishing protection to keep our users safe on the web.  Recently, Gian-Carlo Pascutto made some significant improvements to our Firefox support for the feature, resulting in much more efficient operation and use of the Safe Browsing API for this protection.

Privacy in the Safe Browsing API

I want to take a little time to explain how this feature works and why I like it from a privacy perspective:  Firefox can check whether or not a web site is on the Safe Browsing blacklist without actually telling the API what the web site is called.

At a high level, using this API to find URLs on the "bad" list is like asking your friend to identify whether or not he likes things you show him through a dirty window.  Say you hold up an apple to the dirty window, and your friend on the other side sees a fuzzy image of what you're holding.  It looks round and red and pretty small, but he's not sure what it is.  Your friend looks at his list of things he doesn't like and says he likes everything like that except for plums and red tennis balls.  While he still does not know exactly what you're holding, you can know for sure he likes the apple.

More technically, this uses a hash function to turn web URLs into numbers, where each URL maps to its own number (hash collisions are possible in principle, but vanishingly rare).  For each site you visit, Firefox hashes the URL and sends the first part of the resulting number to the Safe Browsing API.  The API responds with any values on the list of bad URLs that start with the value it received.  When Firefox gets the list of "bad" site hash values that match the first part, it looks to see if the entire hash is in the list.  Based on whether or not it's in the provided list of bad stuff, Firefox can determine whether the URL is on the Safe Browsing blacklist.

Consider this hypothetical example of two sites and their (fake) hash values:

Site                       Hash Value
http://mozilla.com         1339
http://phishingsite.com    1350

When you visit http://mozilla.com, Firefox calculates the hash of the URL, which is 1339.  It then asks the Safe Browsing API what bad sites it knows about that start with "13".  The API returns a list of numbers including "1350".  Firefox takes that list, notices that 1339 (http://mozilla.com) is not in it, and concludes the site must be okay.

If you repeat the same procedure with http://phishingsite.com, the same prefix "13" is sent to the API, and the same list of bad sites (including 1350) is returned.  In this case, however, the site's hash is "1350", so Firefox knows it's on the list of bad sites and gives you a warning.
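The lookup in the examples above can be sketched in a few lines of Python. This is only an illustration of the hash-prefix idea -- the real Safe Browsing protocol has its own URL canonicalization, prefix sizes, and local caching -- and `fetch_matches` is a hypothetical stand-in for the network call:

```python
import hashlib

def url_hash(url):
    # Hash the full URL; only a short prefix of this ever leaves the browser.
    return hashlib.sha256(url.encode("utf-8")).digest()

def is_blacklisted(url, fetch_matches, prefix_len=4):
    full = url_hash(url)
    # fetch_matches stands in for the remote API call: given a prefix,
    # it returns every full bad-site hash starting with those bytes.
    candidates = fetch_matches(full[:prefix_len])
    # The final comparison happens locally, so the service never
    # learns exactly which URL was checked.
    return full in candidates
```

The privacy win is in the last line: the full hash is compared on the client, so the server only ever sees a short, ambiguous prefix.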

For you techies and geeks out there: yeah, I'm glossing over a few protocol details, but the gist is that you don't need to tell Google exactly where you browse in return for the bad-stuff blocking. 

Keeping the Safe Browsing Service Running Smoothly

Google hosts the Safe Browsing service on the same infrastructure as many of their other services, and they need to ensure that our users aren't blocked from accessing the malware and phishing blacklists, as well as make sure they invest the right resources to keep the service operating well.  One of the mechanisms they use for this quality-of-service assurance is a cookie, so the first request Firefox makes to the Safe Browsing API results in the setting of a Google cookie.

I know that not everyone likes that cookie, but Google needs it to make sure their service is working well so I've been working with them to ensure that they can use it for quality of service metrics but not track you around the web.  The most straightforward way to do this is to split the Firefox cookie jar into two: one for the web and one for the Safe Browsing feature.  It's not there yet, but with a little engineering work, in a future version of Firefox that cookie will only be used for Safe Browsing, and not sent with every request to Google as you browse the web.

The cookie can be turned off entirely if you disable third-party cookies in Firefox.  When you turn off third-party cookies, your browser will not send the Google cookie -- even if it was previously set -- unless you visit a Google website directly. You can also turn off malware and phishing protection, but I really don't recommend it.

Making "Safer Browsing"

While Firefox has been using Safe Browsing for a while, Google has started experimenting with a couple of new Safe Browsing features for additional malware and phishing filtering.  It's not yet clear how effective they are or how much of my browsing history would be traded for the improvement.  Both new features involve sending whole URLs to Google, and departing from Firefox's current privacy-preserving state requires evidence of a significant gain in protection. When Google measures and shares how much protection is gained by their pilot deployment in Chrome, we can take a deeper look and consider whether these new features are worth it.

For now, Firefox users are getting a lot of protection for very little in return and there does seem to be good reason for Google to use cookies with Safe Browsing.  We are always looking out for things we can do to give Firefox users both the best of privacy and security.

Monday, December 19, 2011

seat belts and airbags

As much as I like giving users choice and control, bombarding people with too many options makes using software painful.  This is why it is important to consider both the defaults and the flexibility of all the privacy-impacting features we roll out -- the airbags and seat belts of the software industry.  Not everyone who cares about privacy knows how to configure Firefox (or any software) to precisely suit their needs.  The people who care about their privacy and the people who know how to configure software to get exactly what they want are not the same group; those with both qualities are often privacy professionals, or they work in a related field.


A couple of weeks ago, I was inspired by some things Dr. John Guelke said to segment my thinking on privacy into two efforts: privacy-feature seat belts and airbags.  He approaches privacy as something driven by social norms, whereas until recently, I mostly thought about it as a subjective choice about what I want done with my identity and data.  In fact, both of these perspectives are important, and they must work together to create the most positive effect for the Web.  There are distinctly different reasons to provide certain safe defaults than to give users ultimate control through features: the airbags can help protect everyone†, and the seat belts†† will protect those who know to use them.

Choice and Control (Seat Belts).

It's crucial that people have all they need to maintain complete control over their experiences online, or the web becomes controlled solely by the businesses on it and not the people who live in it.  Increasingly, people are performing more of their everyday activities online, and they deserve as much agency in those activities as they would have in the real world.  This is why I care so much about giving people who want it control over every bit of how they see and interact with the web.  This is the reason Do Not Track was built into Firefox, and it's why the software lets people change how the browser handles cookies. These features empower users to control their experiences online.

Users enable and deploy these features on their own.  Firefox doesn't turn on Do Not Track by default, because it's a seat belt.  People choose if they want it or not.

Social Norms (Airbags).

There are expectations about behavior that are consistently held by a society or group.  These social norms dictate expected behavior and, though they don't strictly limit it, can be seen as social defaults.  Norms change and fluctuate with the society, but you could say they are precisely what any member of the society expects to happen.

The Web is a society of sorts, and people carry over their social norms from physical interactions with people to those interactions with web sites and corporate entities online.  Here is where the social norms very importantly dictate the defaults of how a web browser should work (and frankly, how web sites should work too).  People expect a site to remember small bits of information about their interactions, such as what is in their shopping cart, and this is why cookies are enabled by default, like an airbag.  People do not expect to disclose their precise location to web sites, and that is why Geolocation is not activated by default.



Directing Efforts.


There are two driving forces here that dictate the best paths forward for inventing and building privacy features into the web: social norms, and individual choice.   It's easier to listen to the cry for individual choice, or to predict a need for it; we can create any feature as if it were a seat belt -- a feature that users may or may not want to enable.  The harder direction is understanding and following social norms, or what people expect without request or action.  These are hard because they differ not only with time, but also across different groups of people.  Technologists like me can more easily understand our subculture's values and build those into our software.  We have to be careful, though, since society as a whole may not have the same values as our smaller group of software developers.  We as an industry need to focus on what benefits everyone as a sensible default, and that may be the complete opposite of what we computer geeks think.

We need a better understanding of social norms and how they relate to people's data online.  That understanding can help map norms to the defaults we build into all the web-oriented software we make.  Everything else should then be an optional feature, like a seat belt.

Though you may not use all of Firefox's privacy features, I do recommend wearing your seat belt.  Really.  It could save your life.  :)


--- Footnotes:---

† = Okay, so the analogy breaks down since airbags aren't good for you unless your seat belt is engaged, but the gist is that you don't have to think about the airbags.

†† = And sure, "everyone" knows about seat belts, but pretend for this argument that they don't and the feature is more like those glass-breaking hammers that you can buy to free you from a submerged car; you can buy and use them, but they don't usually come with your car.

Wednesday, November 09, 2011

Firefox won't activate DNT by default

Firefox isn't gonna turn on DNT by default because then DNT won't work.
"As Do Not Track picks up steam and standardization is well underway in the W3C, people have begun asking, "If Do Not Track is so good for the web, why don't you turn it on by default?"
"Frankly, it becomes meaningless if we enable it by default for all our users. Do Not Track is intended to express an individual's choice, or preference, to not be tracked. It's important that the signal represents a choice made by the person behind the keyboard and not the software maker, because ultimately it's not Firefox being tracked, it's the user. "

(Link)
Sure, we could run a few engagement campaigns to inform people about the option, but we won't make that decision for our users.

Edit (9-Nov-2011 @ 11:24): 

There are three different signals to consider in broadcasting the user's preferences for tracking:

1. User says they accept tracking
2. User says they reject tracking
3. User hasn't chosen anything

We're defaulting to state 3: we don't know what the user wants, so we're not sending any signals to servers.  The signal being sent should be the user's choice, not ours, so we don't broadcast anything until they've chosen what to send.
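The tri-state logic above can be sketched in a few lines. This is an illustration of the policy, not Firefox's actual code, and whether browsers send a "DNT: 0" for the accept case is a protocol detail I'm glossing over here:

```python
def dnt_headers(preference):
    # preference is tri-state:
    #   True  -> user rejects tracking: send "DNT: 1"
    #   False -> user accepts tracking: send "DNT: 0"
    #   None  -> user hasn't chosen:    send no signal at all
    if preference is None:
        return {}
    return {"DNT": "1" if preference else "0"}
```

The key point is the `None` branch: an unset preference produces no header, so servers can't mistake a default for a choice.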

Wednesday, September 28, 2011

Measuring Progress

In the reasonably short time that I've been involved with Mozilla, we've made amazing changes to the web and our Firefox browser. We've seen the adoption of HTML5, open video, and a slew of other features. This means the web is more complex than ever, and by extension, so is Firefox.

Sometimes Firefox doesn't perform as well as it should, and it's hard for us to understand why.

Enter the Telemetry project. Our performance team, led by Taras Glek, developed a feature that lets us measure performance-related stuff as you use Firefox. Starting with the version of Firefox released today, you have the opportunity to opt in to sending us some of these statistics. They're not tied to you, and we will look at the data in aggregate to see if there are widespread problems in the various bits of Firefox's plumbing.

I posted a note about this over on The Mozilla Privacy Blog. As we deployed this feature, we worked hard to make sure that our users will have choice and control over the data they send us. This involves a few bits of critical thinking: first, we have to make sure you're not surprised by this.  Second, we make sure that we're only collecting what we need to make Firefox better. Third, our practices must be transparent -- not just open source; we try to be clear about what we collect.  Fourth, we make sure that you know you're sending us this data and can make it stop if you want.

We wrote down how Telemetry works for you to read (if you want) and how the feature lines up with the promises put forth in the Privacy Operating Principles that we've been working with for a while now. As we add new probes to Telemetry to see where to improve Firefox, we'll be cataloging those as well, including risk analysis for anything that's remotely private.  We'll never collect stuff like your address or credit card numbers through this system (that'd be weird), but we may want to know which of your add-ons are slowing down Firefox.

This risk analysis and privacy review are the things we plan to do with new Firefox features that involve your data; whether or not we collect anything, it's important that we live up to the operating principles we've put out, and Telemetry is an early example of how we plan to keep you in control.

Thursday, September 22, 2011

Careful... pixel-data access is pointy

Robert O'Callahan writes:
Some Web applications require the pixel data of Web pages to be exposed to Web applications [...] There are some pretty big security implications here. The biggest problem is cross-origin information leakage.
He's right on. There are a bunch of subtle risks to haphazardly implementing pixel-data access. The one near and dear to my heart is the risk of defeating what we shipped a while back to stop CSS- and JavaScript-based history sniffing. Draw links, read colors, defeat the fix. Not good. We can't just lie to the content script attempting to access the rendered data -- once it's drawn, it's really hard to figure out what's a link and what isn't. So what do we do? Take a look at this and the other issues with implementing pixel-data access over on his blog. If you've got ideas, we're all ears.

Thursday, July 14, 2011

on unifying site behavior and consent

Let's face it: the users of your ShinyNewWebSite(beta) will never know exactly how it works. Perhaps that's by design (look, it's magic!), perhaps it's simply because they're not computer programmers, but this is the reality.

So there's this problem: how do I get users to provide informed consent to use my shiny new data collection web site? I want to do some really cool stuff, but I want the users of the site to know what's happening and feel in control.

This is hard. I think there's a ton of value in data mining and personalization, and it's not reasonable to expect users to comprehend the entire process of how their data is collected and used. We do however need to empower users to manage trust for the organizations who collect and use their data, and one way to do this is to get them closer to understanding what happens.

Here's one way I've been thinking about this: on one end of a spectrum are the users; they have values and want to assert protection over some of their data. On the other end of the spectrum are the web sites; they produce value from the users' data and want to be honest and compliant with users' desires. Right now there's often a huge gap between what users want and what sites actually do with their data. We need to shrink this gap.


I've talked about this gap from a user's perspective before (the privacy perception gap) and ultimately this gap leads to shock and discomfort. In Firefox 4, we deployed DNT as one feature to help shrink the gap from the user's informed-consent side.


Anything we can do to make users' preferences and privacy choices obvious shrinks the gap from the user side, but we should work from the site's side as well, and hope the efforts meet somewhere in the middle. What else can we do to help bring site behavior into the user's mental model of what's going on?


We need something new to improve upon privacy policies. We need something more objective than self-explanation. We need something empirical that can be measured, digested and shown to users. We need technology that makes it easier for people to peer into the opaque bits of the web and see what data is collected and how it's used. While it's not realistic to expect a silver bullet that makes all users instantly understand how sites work, we should still try hard; let's throw all the ideas that we have out on the table and approach this gap with as many tools as we have to try and shrink it.

Thursday, May 26, 2011

managing your relationship with sites

This post is co-written by Margaret Lebovic and Sid Stamm. This article is cross-posted on Margaret's blog

As the web becomes more and more complex (and AWESOME), it's important that you can manage your relationship with the variety of sites out there. Sure, Firefox 4 has a Page Info dialog that lets you control what a web page is allowed to do, including whether you want to let it store data on your computer, access your location information, open pop-up windows, and on and on. However, this dialog only lets you manage your relationship with the one page you're currently visiting, not the entire set of sites you visit on the web.

We think it's important to be able to manage your whole relationship with web sites in an intuitive way, and that's why we're excited to show you what we've started working on: a site-based permissions interface.


This feature is still experimental, but you can give it a shot. In the future, we'll be putting some polish on the UI, adding more controls like "always access securely" (HSTS), and hopefully giving you a better view of what a site knows about you. We also want to integrate this permissions manager with the site identity block in the location bar for quick and easy access.

Try it out! Grab a Firefox nightly build and try out the feature by typing about:permissions into the location bar.

(Credit: thanks to Jennifer Boriss, Medhi Mulani and Margaret for all the hard work on this project.)

Friday, May 20, 2011

Do Not Track -- Now on Firefox Mobile!

Since we first announced our implementation of the Do Not Track HTTP header, we've seen an amazing amount of support from trade groups, and even other browser makers.

To build on our view that you should have control of how you're tracked not only on desktop computers but also on your mobile devices, we're excited to announce that the latest beta of Firefox for Android also includes this feature.

You can enable Do Not Track in the latest beta of Firefox for Android through an easy-to-find switch in the preferences (see the image to the right), and websites will see exactly the same signal that Do Not Track-enabled desktop browsers send. Every time Firefox loads a web page, image, or advertisement, it includes a "DNT: 1" signal that tells the entire web you don't want to be tracked.

The web on your phone should be the same web as on your desktop, so to provide this consistency we've put the exact same Do Not Track feature in both the desktop and mobile versions of Firefox.

Try it out today! Grab the latest beta of Firefox for Android and turn on the feature. If you visit my blog from Firefox (mobile or desktop) with Do Not Track turned on, the widget below will glow green just for you.

Sunday, May 15, 2011

Clearing Flash cookies using Firefox

Back in March, we shipped Firefox 4 with a feature that sends a signal to plugins like Flash and Silverlight when you clear your cookies. Adobe has announced that starting with Flash Player version 10.3, they'll be listening to the signal! This is exciting, because clearing your Flash cookies is now as easy as clearing regular cookies.

Here's when Firefox 4 tells Flash Player version 10.3 to delete LSOs (Flash cookies):
  • When you clear all your cookies in Firefox using "clear recent history" [how-to link]
  • When you choose "forget about this site" in your library (history) window [how-to link]
  • When you quit Firefox, if you have Firefox configured to clear your cookies automatically upon exit [how-to link]

Chrome and Internet Explorer are also supporting this behavior, so this is fantastic news for everyone's privacy on the web!


Monday, February 07, 2011

Get your DNT header for older versions of Firefox!

When we recently announced our intent to add a do not track header to Firefox, we focused on how it will probably be available in a future version -- Firefox 4.0. But what about people who would prefer to use previous versions of Firefox? How can you get the HTTP header into version 3.6, or even earlier versions?

Though we recommend using our latest and greatest product, there's an add-on you can install to add the "DNT: 1" header to older versions: Universal Behavioral Advertising Opt-Out (a.k.a. UBAO). The name is a mouthful, but its operation is simple: installing this add-on is like ticking the checkbox in new versions of Firefox to send a "DNT: 1" HTTP header with all requests your browser sends out.

Other add-ons send the header too! AdBlock Plus and NoScript both do, but if you don't want the extra features that come along with those add-ons, UBAO is for you.

Monday, January 31, 2011

Try out the "Do Not Track" HTTP header

Last week, I blogged about some of the work we're doing at Mozilla to help people better control how they're tracked as they browse the web. The basic idea was to give people a universal "opt out" of tracking for behavioral advertising. A Firefox user will be able to check a box in the preferences dialog, and an HTTP header will then be sent with all HTTP requests so all servers know the user wants to opt out.

Well, I'm excited to report that we've landed the first iteration of this feature into Firefox nightly builds (the pre-beta builds that are rough around the edges)! If you'd like to try out the feature, grab a nightly build; I must warn you though, these nightlies are not as stable as the beta releases.

In the build, to enable the feature, open the preferences pane and select the advanced tab. Tick the box that says "Tell sites I do not want to be tracked" and start browsing.




Every connection your browser makes to download content will send a signal that says "don't track me." Literally, it looks like this to servers:
DNT: 1
Note: this is different from the initial experiment that used "X-Do-Not-Track" and my original post last week that said "Tracking-Preference: do-not-track"; it's both shorter and very precise. The researchers at donottrack.us are also recommending this syntax.

I encourage you to try out the test builds, or if you'd like to wait for a more stable version, wait for an upcoming beta release with the feature in it. We do not anticipate that sites are looking for the signal yet, so you probably won't notice a difference as you browse the web. I'm hoping to have a demo site available shortly that will give you an example of what types of changes you might see using this feature -- and when I do, I'll post a link here.

Friday, November 12, 2010

bah. blacksheep.

It's a deeply satisfying way to defeat an attacker: give him a taste of his own medicine, and the BlackSheep add-on attempts to do just that. This add-on listens for Firesheep's patterns on the network and warns a user that they are being watched. BlackSheep drops juicy bait for Firesheep (fake session cookies wrapped in legitimate-looking requests to sites) and waits for Firesheep to pick them up. When a Firesheep attacker tries to exploit these fake insecure sessions, BlackSheep picks up on it and alerts its user.

While there are some Firesheep users that will set off this alarm, the problem is a bit more serious. Many attackers have probably looked deeper into the fundamental flaw and also want to protect themselves from such an attack. These folks will be using a security-forcing add-on like Force-TLS, STS-UI on Firefox 4, HTTPS Everywhere or NoScript. All of the protected bad guy's connections to vulnerable sites will be secure, including the sessions that have been copied from other users on the network.

The upshot is that while BlackSheep may see the attacker connecting to vulnerable sites, it won't always know when the bait was taken. For many serious Firesheep-like attacks, BlackSheep will remain quiet.

There are many Firesheep users who are just playing around with the new, hot hacking tool, so BlackSheep will cry out some of the time; it can't, however, be relied on to detect most Firesheep attacks. While I like the idea of an alarm like BlackSheep, solving the underlying problem (and stopping the serious attackers who are more of a cause for concern) is a bit more complex and requires a different approach. Ultimately, we need to secure wifi networks, protect ourselves with HTTPS-forcing technologies, and ask sites to protect their users by using HTTPS for the whole session.

Tuesday, October 26, 2010

Managing HSTS Data

I blogged about HTTP Strict-Transport-Security before and how it's all new and shiny in Firefox 4. With Firesheep attacks on the horizon, it's even more important that sites start using HSTS.

In case you don't want to wait for your favorite sites to start deploying strict-transport-security, here's a way for you to enable it yourself. I whipped up a quick add-on proof of concept that lets you add and remove HSTS data.

There are two ways to manage HSTS data for sites using this add-on:
  1. Navigate to an HTTPS page, open the page info dialog, and tick the "Always access content from this site securely" box
  2. Choose the "Manage Strict-Transport-Security..." item from the Tools menu, and enter the host names for your favorite sites there.

Let me know what you think!

UPDATE: Instead of maintaining the add-on in parallel with Force-TLS, I've decided to adapt Force-TLS to use the HSTS bits built into Firefox 4 and show you the same UI. Instead of the STS-UI add-on, try installing Force-TLS!

Thursday, August 26, 2010

HTTP Strict Transport Security has landed!

It was a year ago now that I first blogged about ForceTLS, and it's matured quite a bit since. I revised ForceTLS to be more robust, and began actually implementing it as HTTP-Strict-Transport-Security in Firefox. I'm excited to say that my patch has been reviewed and landed in mozilla-central.

What's that mean? Look for it in the next beta release of Firefox 4! If you can't wait, grab a nightly build, but when 4.0 is released, HTTP Strict-Transport-Security will be built-in and turned on by default.


Though the feature's core functionality is there, work on HSTS is not completely finished. There are still a few tweaks I'd like to make, mainly providing a decent UI so people can add/remove HSTS state for servers themselves -- but none of this is necessary to be specification compliant. As landed, HSTS is the behind-the-scenes implementation that listens to the HTTP Strict-Transport-Security header and follows those instructions.


In case you don't feel like trawling through the IETF Internet Draft specification but you want to figure out how it works, here's a quick summary:


  1. Over an HTTPS connection, the server provides the Strict-Transport-Security header indicating it wants to be an HSTS host. It looks something like this:
    Strict-Transport-Security: max-age=60000
    The header's presence indicates the server's desire to be an HSTS host, and the max-age states for how many seconds the browser should remember this.
  2. For an HSTS host (e.g., paypal.com), any subsequent request assembled for an insecure connection to that host (http://paypal.com) will be rewritten to a secure request (https://paypal.com) before any network connection is opened.
  3. Optionally, the header can include a second includeSubdomains directive that tells the browser to additionally "upgrade" all subdomains of the serving host. That looks like this:
    Strict-Transport-Security: max-age=60000; includeSubdomains

If Firefox knows your host is an HSTS one, it will automatically establish a secure connection to your server without even trying an insecure one. This way, if I am surfing the 'net in my favorite cafe and a hacker is playing MITM with paypal.com (intercepting http requests for paypal.com and then forwarding them on to the real site), either I'll thwart the attacker by getting an encrypted connection to paypal.com immediately, or the attack will be detected by HSTS and the connection won't work at all.
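The upgrade step itself can be sketched in a few lines (illustrative only -- Firefox does this internally, before any socket is opened, and the function name here is mine):

```python
from urllib.parse import urlsplit, urlunsplit

def upgrade_if_hsts(url, hsts_hosts):
    parts = urlsplit(url)
    # Known HSTS hosts never get a plaintext connection attempt:
    # rewrite http:// to https:// before touching the network.
    if parts.scheme == "http" and parts.hostname in hsts_hosts:
        return urlunsplit(parts._replace(scheme="https"))
    return url
```

Because the rewrite happens before the request leaves the browser, a network attacker never even sees an insecure request to tamper with.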

There are more details in the specification, like some further restrictions on cert errors and the like, but this is the general setup, and I believe a pretty nice feature to increase your site's security.


Also: Jeff and Andy at PayPal are working hard at standardizing this.



Thursday, July 01, 2010

privacy-preserving videos

It's hard to resist the YouTube hotness -- especially if you feel like blogging about the adorable kittens you just saw on YouTube because all your friends absolutely must see them. Sarcasm aside, there are plenty of reasons to embed videos on a blog or web site, but there's something to consider: the embedded video might be used to track your visitors!

Frankly, any content embedded on your site but hosted by another site can cause privacy concerns. When the content is requested, the user's cookies go along with it. The effect is that not only does YouTube know what site a user is on when viewing the kitten video, but they potentially also know who the viewer is! This means that unless you log out of YouTube regularly, they know all the videos you've viewed and on what sites you saw them.

This isn't breaking news. This stuff has been around for a while, it's just Not Obvious.

So you run a blog or web site and you don't want to aid in YouTube tracking your visitors. What do you do? You have three options in my view:

1. Use YouTube's privacy mode. You can embed videos that suppress sending cookies until the visitor has clicked to watch them. This works by changing URLs for any content automatically loaded by your site from the domain "youtube.com" to "youtube-nocookie.com". This way the cookies (for youtube.com) aren't sent until more content is loaded from youtube.com after the click.

2. Don't trust 'em? Use EFF's MyTube. This does something similar, but the static content is loaded from your own site, then when the visitor clicks on the video, it is loaded from youtube.com and the cookies are sent.

3. Open Video FTW! If possible, host the video yourself (on your own website) and use the <video> tag! More info here. Flash is not the only way to display video on the web! Use open standards!
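For option 1, the domain swap is mechanical enough to automate. Here's a rough Python sketch that rewrites YouTube embed URLs to the cookie-suppressing domain; a real implementation should use an HTML parser rather than a regex, and the exact URL shapes YouTube serves may vary.

```python
import re

def privacy_embed(html):
    """Rewrite youtube.com embed URLs to youtube-nocookie.com so the
    initially loaded content doesn't send youtube.com cookies."""
    return re.sub(r"(https?://(?:www\.)?)youtube\.com(/embed/)",
                  r"\1youtube-nocookie.com\2", html)
```

Running your page through this leaves everything else untouched; only embed URLs pointing at youtube.com get moved to youtube-nocookie.com.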

Friday, April 09, 2010

history sniffing fix has landed

David Baron's history sniffing fix has landed in the trunk repository (VCS nerds, click here for details)! This means you can grab one of our nightly builds and try out the fix for yourself -- but be warned, these nightlies aren't always stable, since they're rapidly changing.

While the fix isn't in the final version of Firefox yet, it should be in the next feature revision (3.7, or whatever major comes up next), and is shipping in alpha releases starting with 1.9.3a4. We're hoping to use the incubation time in nightlies, alphas, and probably a beta or two, to make sure the fix works and get feedback from some users. If you are skeptical about our fix or just want to test drive it, grab a nightly build and let me know what you think!

Wednesday, March 31, 2010

turning off the :visited privacy leak

Since I started at Mozilla, I've been trying to increase momentum on fixing the history sniffing privacy leak. I've been able to get lots of people interested, and David Baron has worked hard to come up with a fix. This is a hard problem, and the stars have finally aligned: the Firefox source code, our thinking, research, and a need have come together to get this done.

David has nearly finished an implementation of a plug for the leak, and it's a pretty nice solution that strikes a balance between privacy and utility. In the end, we're going to have to break the web, but only a little bit, and in ways we believe sites can recreate with other techniques.

The fix has three parts:
  1. :visited is restricted to color changes. Any size or other types of layout/loading effects are disabled. The colors that can change are foreground, background, border, and SVG outline and stroke colors.
  2. getComputedStyle and similar functions will lie: all links will appear unvisited to the web site, but you'll still see the visitedness when the page is rendered.
  3. The layout code has been restructured to minimize the difference in code path for laying out visited and unvisited links. This should minimize timing attacks (though it can't remove them all).
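Parts 1 and 2 can be modeled conceptually: the renderer uses the real visited state to paint, while script-visible APIs like getComputedStyle always report the unvisited style. This Python sketch is just a model of the idea, not Firefox's layout code; the color values are made up.

```python
VISITED_COLOR = "purple"
UNVISITED_COLOR = "blue"

def rendered_color(url, history):
    """What the user sees on screen: painting uses real visitedness."""
    return VISITED_COLOR if url in history else UNVISITED_COLOR

def computed_style_color(url, history):
    """What script-visible APIs report: always the unvisited style,
    so a page can't read the user's history through styling."""
    return UNVISITED_COLOR
```

The user still gets the visual cue, but a sniffing script comparing computed styles learns nothing: every link looks unvisited to it.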

I don't think web sites should be able to extract your browsing history without your consent; this is one of the bits of the web that rubs me the wrong way, and I'm excited we've made some progress towards removing this from the web. If it rubs you the wrong way too, and you just can't wait for our upcoming fix, you can turn off all visited links in Firefox 3.5 and newer. This breaks the web even more, but is an immediate hack if you want to hide from the sniffers.

Over the last few years, I've been collecting a list of sites that show how this trick can be abused. Hopefully all of them will stop working with the new fix!

More reading:

Friday, January 29, 2010

cookies by many different names

Cookies are great, and everyone loves them (chocolate chip are my favorite) but if we leave the Internet to its own devices it could potentially drive itself into a state of utter deception where other technologies are secretly used in place of cookies for tracking and identification purposes.

Having spent the past two days submerged in various privacy discussions, I've started thinking deeply again about cookies and tracking. The fundamental privacy concerns about HTTP cookies (and other varieties like Flash LSOs) come from the fact that such a technology gives a web server too much power to connect my browsing dots. Third-party cookies exacerbate this problem -- as do features like DOM storage, Google Gears, etc.

Come to think of it, cookies aren't unique in their utility as dot-connectors: browsing history can also be used. A clever site can make guesses at a user's browsing history to learn things such as which online bank was recently visited. This is not an intended feature of browsing history, but it came about because such a history exists.

But wait, cookies, Flash LSOs, DOM storage, and browsing history aren't uniquely useful here either! Your browser's data cache can be used like cookies too! Cleverly crafted documents can be injected into your cache and then re-used from the cache to identify you.
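One well-known way the cache gets abused is via ETags: the server hands each first-time visitor a unique validator, and the browser's cache dutifully echoes it back on revalidation, re-identifying the visitor without any cookie at all. This toy Python simulation (hypothetical class names, no real networking) illustrates the flow.

```python
import uuid

class TrackingServer:
    """Toy model of cache-based tracking via ETags (illustrative only)."""

    def respond(self, if_none_match=None):
        if if_none_match is not None:
            # Revalidation: the echoed ETag identifies the returning visitor.
            return {"status": 304, "visitor_id": if_none_match}
        # First visit: hand out a unique ETag for the cache to store.
        visitor_id = uuid.uuid4().hex
        return {"status": 200, "etag": visitor_id, "visitor_id": visitor_id}

class Browser:
    """Toy browser cache that echoes stored ETags, as real caches do."""

    def __init__(self):
        self.cached_etag = None

    def fetch(self, server):
        resp = server.respond(if_none_match=self.cached_etag)
        if "etag" in resp:
            self.cached_etag = resp["etag"]
        return resp
```

Two fetches from the same toy browser produce the same visitor_id: the cache has become a cookie in everything but name.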

In fact, all state data created or manipulated in a web browser by web sites has the potential to be a signal for tracking or other dot-connecting purposes. Even if the state change seems to be write-only, there could be other features that open up the read direction (e.g., the CSS history snooping trick mentioned above -- or timing attacks).

Stepping back and thinking about these dot-connecting "features" in the context of the last couple days' privacy discussions has got me wondering if there's not a way we can better understand client-side state changes in order to holistically address the arbitrary spewing of identifying information. I think the first step towards empowering users to protect themselves better online is to understand what types of data are generated or transmitted by the browser, and what can be used for connecting the dots. After we figure that out, maybe we can find a way to reflect this to users so they can put their profile on a leash.

But while we want to help users maintain the most privacy possible while browsing, we can't forget that many of these dot-connecting features are incredibly useful and removing them might make the Web much less awesome. I like the Web, I don't want it to suck, but I want my privacy too. Is there a happy equilibrium?

How useful is the Web with cookies, browsing history, and plug-ins turned off? Can we find a way to make it work? There are too many questions and not enough answers...