Default preferences: there is no way to avoid decisions

An excellent post on the Mozilla blog, by a former colleague of this blogger, touches on the problem of default settings in software: to what extent do they represent user intent as opposed to user apathy? It has long been accepted wisdom among software developers that few users change the default settings their applications ship with, and fewer still venture near options marked “advanced” or some similarly intimidating adjective meant to scare them away. Until now there was little public data to back up this intuition. The new research finally delivers hard numbers, with a few twists. The established wisdom survives: many of the knobs and levers are indeed untouched. The catchy title of the post, “Writing for the 98%,” follows from that observation:

Is it worth the engineering effort, UX effort, and screen real estate to make user-visible (to say nothing of discoverable) preferences if fewer than 2% of users benefit?

In the best-case scenario, a whopping 10% of users enabled Do Not Track, which is off by default. Privacy appears to be that rare strong motivator for tweaking software settings: 1.5% disable history completely, 3% clear history on exit, and 5% always start the browser in private mode. A similar rate of activity is observed for security-related settings. For example, about 5% disable the password-manager functionality that remembers credentials. Curiously, some infrequent modifications disable existing security checks: about 1-2% opt out of the safe-browsing and malware checks.

At the other extreme are settings controlling the minutiae of security protocols, hidden under the Encryption tab of Advanced settings. 0.02% of users have taken the trouble to disable SSL 3, something this blogger also does for IE. About 2% disable OCSP, but more curiously 0.03% require OCSP checks. Perhaps the first group is trying to work around slowdowns and certificate-validation errors caused by failed OCSP checks. (X509Labs tracks OCSP responder performance by certificate authority to keep CAs honest.) The second group is likely security-conscious users who want revocation errors to hard-fail: if an OCSP response is not available, treat the certificate as untrusted instead of merrily carrying on. More surprising is that 1% of users have chosen to automatically send a personal certificate if there is exactly one. This option refers to client certificates, typically used only in high-security enterprise/government scenarios where users log in to websites with smart cards instead of passwords.
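The soft-fail versus hard-fail distinction is easy to sketch in a few lines of Python. This is an illustrative simplification, not Firefox's actual revocation code; the function name and status values are hypothetical:

```python
# Illustrative sketch of soft-fail vs. hard-fail OCSP handling.
# Names and status strings are hypothetical, not Firefox's implementation.

def is_certificate_trusted(ocsp_status, require_ocsp=False):
    """Decide trust given an OCSP responder answer.

    ocsp_status: "good", "revoked", or None (responder unreachable).
    require_ocsp: the hard-fail preference that 0.03% of users enable.
    """
    if ocsp_status == "revoked":
        return False  # always distrust a certificate known to be revoked
    if ocsp_status == "good":
        return True
    # No answer from the responder: soft-fail merrily carries on,
    # hard-fail treats the certificate as untrusted.
    return not require_ocsp

# Soft-fail (the default): an unreachable responder is silently ignored.
assert is_certificate_trusted(None) is True
# Hard-fail (the 0.03%): no answer means no trust.
assert is_certificate_trusted(None, require_ocsp=True) is False
```

The soft-fail default explains why disabling OCSP outright can feel harmless: a blocked or slow responder was never going to stop the connection anyway.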

But there is also a contradiction lurking here. The data point that only 10% of users enabled DNT is used as evidence that Mozilla made the right call in defaulting the setting to off:

This is an astonishingly high number of users to enable an HTTP header that broadcasts user intent, but is unable to enforce anything client-side. It is a testament to DNT advocates that adoption is this high, but even though this preference is changed by a large minority of users, Firefox should not enable it by default.
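For reference, the signal being debated amounts to a single header on every HTTP request. A minimal illustration with Python's standard library (the URL is a placeholder; no request is actually sent):

```python
import urllib.request

# Construct a request carrying the Do Not Track header.
# Nothing is transmitted here; we only build the request object.
req = urllib.request.Request("https://example.com/",
                             headers={"DNT": "1"})

# The header merely declares a preference; as the quote above notes,
# nothing is enforced on the client side.
print(req.get_header("Dnt"))  # urllib normalizes the header name
```

That one-bit declaration is the entire mechanism: whether it is honoured depends wholly on the server receiving it.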

There is a circularity to the argument. All the data indicates that users are not messing with browser settings. If Firefox instead shipped with DNT on by default and discovered that only 10% of users disabled it, would that also be an occasion to congratulate the designers on making the right call? (In fact, since DNT, unlike OCSP, does not break anything yet, it is very likely few users would touch it.) While we know that few users modify settings, the reasons are unclear. The post touches on some possibilities:

There are at least three plausible, possibly-overlapping interpretations: Firefox predicted the most useful default settings correctly, Firefox is doing a poor job converting user actions into saved preferences, or the population who cares about browser security preferences is really that small.

Another option: users lack meaningful information about what all these different browser settings mean, and do not understand what is at stake. It is one thing to have preferences about privacy and online tracking at a high level. It is another to connect the dots between those intuitions and “third-party cookie blocking” or DNT. In general we cannot equate the absence of a decision with an endorsement of the status quo. Did these users actually visit the Privacy tab and verify that the DNT setting was configured as expected?

The situation is similar to the discrepancy in organ-donation rates between Germany (low) and Austria (high). The problem is not that one society is less altruistic or espouses different views on death. It turns out that Germany uses an opt-in scheme, while Austria goes with opt-out. In both cases, few citizens go out of their way to change the default. If public officials in these countries followed the Mozilla logic, they could all pat themselves on the back for having correctly predicted public sentiment, when in fact they were shaping it.

This underscores the uncomfortable place software vendors find themselves in. In an ideal world there is complete separation between policy and mechanism: users decide the policy for how their system behaves, and the software developer provides the means for expressing that policy. But in reality any realistic application contains hundreds of policy decisions. Even the most flexible application, one that permits users to tweak each and every setting, starts out in some reasonable default configuration to serve as a starting point; otherwise it would be nearly impossible for users to configure a system from scratch. This is why the Mozilla argument rings hollow:

Frankly, it becomes meaningless if we enable it by default for all our users.  Do Not Track is intended to express an individual’s choice, or preference, to not be tracked.  It’s important that the signal represents a choice made by the person behind the keyboard and not the software maker, […]

Not enabling the setting and leaving users subject to tracking is itself a decision by the software maker. Forcing the question on the end user is the only way to guarantee that their intent is honoured accurately instead of second-guessed. Such in-your-face decision points are rare in mainstream software, because they are considered a distracting user experience. Two well-known examples are the browser-choice screen forced upon Windows 7/8 by the EU consent decree, and the now-deprecated Google Toolbar installer asking about PageRank. These are the exceptions proving the rule. For all but the simplest software, striving for value-neutral design is an aspirational goal, never quite realized in practice. Accepting that limitation is the first step to recognizing our own biases, preferences, and interests as designers, and to asking how well users are being served by those choices.
