(Trying to write about the conference before the recollections fade.)
Dan Kaminsky was scheduled to give the invited talk on Wednesday morning, tentatively titled “On breaking stuff,” but he was held up by consulting work at IOActive. Fortunately for the conference program committee, Paul van Oorschot volunteered to give a talk on short notice, and the result was the highly engaging presentation “Security and usability: mind the gap.”
He started with some anecdotal evidence on the sad state of affairs in what should have been the poster child for usable security: online banking. One of the largest banks in Canada promised to refund 100% of losses resulting from unauthorized transactions, provided the user lived up to their side of the agreement. The fine print in the customer agreement (granted, nobody pays attention to that) makes for entertaining reading:
- Select a unique and not easy to guess password. And how is the user supposed to judge the quality of their password? Windows Live ID has a password quality meter, but this is far from being a standard feature.
- Sign out, log off, disconnect, and close the browser when done. (What is the difference between the first two? Does “disconnect” mean yank the network cable?)
- Implement [sic] firewalls, a browser with 128-bit encryption, and virus scanning. As van Oorschot pointed out, the bank probably means “deploy” rather than “implement”; otherwise it would drastically narrow its potential customer base to developers with copious spare time for writing commodity software from scratch.
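The password-quality question raised by the first item can be made concrete. A minimal sketch of the kind of heuristic a password meter might use, estimating search space from length and character classes; the thresholds here are illustrative assumptions, not the actual rules of Windows Live ID or any bank:

```python
import math
import string

def estimate_strength(password: str) -> str:
    """Rough password-quality heuristic: approximate entropy as
    length * log2(size of the character classes used).
    Thresholds are arbitrary illustrations."""
    alphabet = 0
    if any(c in string.ascii_lowercase for c in password):
        alphabet += 26
    if any(c in string.ascii_uppercase for c in password):
        alphabet += 26
    if any(c in string.digits for c in password):
        alphabet += 10
    if any(c in string.punctuation for c in password):
        alphabet += len(string.punctuation)
    if alphabet == 0:
        return "weak"
    bits = len(password) * math.log2(alphabet)
    if bits < 40:
        return "weak"
    if bits < 64:
        return "fair"
    return "strong"
```

Even a crude meter like this gives users some feedback where today they get none; real meters also penalize dictionary words and common patterns, which this sketch ignores.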
It only gets worse from there. The general pattern is promises of security and reassurance that damages will be covered, in exchange for vague expectations of “secure behavior” from users who are often not in a position to accurately judge the risks associated with their use of technology. Case in point: one study on malware found that 95% of users had heard the word “spyware” and 70% banked online, yet some assumed that spyware was a good thing, 45% did not look at URLs, and 35% could not explain what HTTPS meant. The status quo in online banking is not an isolated incident, as shown by other case studies drawn from two recent publications van Oorschot coauthored:
- An evaluation of Tor/Vidalia/Privoxy for anonymous browsing, which concluded that Tor is not ready for prime-time use by a novice even with the supposedly user-friendly Vidalia UI. (Given its remarkably low bandwidth and high latency, reminiscent of the early “world-wide-wait” days of dial-up, you have to wonder whether a usability study was necessary to reach that conclusion.)
- A usability study of two password managers with 26 non-technical users, which found several problems, including situations where users falsely concluded a security feature was functioning when it was not: the very dangerous “false success” scenario. [Full disclosure: this blogger reviewed and broke an earlier version of one of them, PwdHash.]
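For readers unfamiliar with PwdHash, the core idea is deriving a distinct password per site from one master password and the site's domain, so a credential phished at one domain is useless at another. A simplified sketch of that general idea, assuming nothing about PwdHash's actual algorithm (which uses HMAC-MD5 with its own encoding rules to satisfy site password policies); this version substitutes HMAC-SHA-256 and plain base64 truncation for illustration:

```python
import base64
import hashlib
import hmac

def site_password(master: str, domain: str) -> str:
    """Derive a site-specific password from a master password and the
    site's domain, PwdHash-style. Illustrative sketch only: real schemes
    must also handle site password-composition rules."""
    digest = hmac.new(master.encode(), domain.encode(), hashlib.sha256).digest()
    # Truncate to a typable length; the same (master, domain) pair
    # always yields the same password, different domains yield different ones.
    return base64.b64encode(digest).decode()[:12]
```

The “false success” failure mode in the study is easy to picture here: if the browser extension silently fails to activate and the user types the master password directly into a phishing page, the protection is gone while the user believes it is working.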
If poor usability is as much a security vulnerability as a flaw in a cryptographic protocol, what is the prescription? This is where the information security community is now wrestling with its collective conscience. van Oorschot made the frank observation that usability and HCI issues are routinely looked down upon in CS culture, excluded from the traditional curriculum as easy or trivial and better left to “people who can’t write code” to sort out. He raised the possibility that we had it backwards all along: cryptography is the easy part, secure system implementation is far more challenging, and the hardest task of all is building usable secure systems.