[continued from part I]
Two challenges confront a paranoid user trying to decide whether the Tor Browser Bundle they just downloaded is in fact the authentic version or malware masquerading as a privacy-enhancing tool.
Key management by cliques
First there is the theoretical problem of key distribution. Verifying a signature requires knowing the public-key of the person who generated the signature. The simplistic idea that “signed” equals trustworthy does not hold up, as many people discovered much to their surprise when perfectly valid signatures were found on run-of-the-mill malware as well as sophisticated state-sponsored attacks such as Stuxnet. It matters who signed the code.
Authenticode uses a hierarchical trust model based on PKI, which is the same model used for SSL certificates. Individuals or companies obtain digital certificates from certificate authorities. The certificate contains information about the person/entity it was issued to, such as their name or DNS domain, as well as the public-key. It is effectively a statement from the CA that the public-key in question belongs to the person/entity named there. Because the certificate is signed by the issuing CA, it can be verified by anyone in possession of the CA public-key. In effect this amplifies trust: by trusting the public-key of one CA, users can establish trust in the public-keys of anyone else who obtains a certificate from that CA. (Assuming they have confidence in the vetting process the CA performs before it is willing to vouch for a public-key.)
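The chain-of-trust mechanics can be sketched with openssl, which speaks the same X.509 language. This is only an illustration, not the actual Authenticode toolchain: the names (demo-ca, demo-publisher) and filenames are all invented.

```shell
# Minimal sketch of the hierarchical model: a toy "CA" issues a certificate,
# and anyone holding the CA certificate can verify it. Demo names throughout.
set -e
dir=$(mktemp -d)
cd "$dir"

# 1. The CA generates a key pair and a self-signed root certificate.
openssl req -x509 -newkey rsa:2048 -nodes -keyout ca.key -out ca.pem \
    -subj "/CN=demo-ca" -days 1

# 2. A software publisher generates its own key and a certificate request.
openssl req -newkey rsa:2048 -nodes -keyout leaf.key -out leaf.csr \
    -subj "/CN=demo-publisher"

# 3. The CA signs the request, vouching that this public-key belongs
#    to "demo-publisher".
openssl x509 -req -in leaf.csr -CA ca.pem -CAkey ca.key \
    -CAcreateserial -out leaf.pem -days 1

# 4. Anyone who trusts ca.pem can now verify the publisher's certificate,
#    without having exchanged keys with the publisher directly.
openssl verify -CAfile ca.pem leaf.pem
```

Note the leverage at work: the verifier in step 4 never obtained demo-publisher’s key out-of-band; trusting the one CA certificate was enough.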
This model scales very well in relation to the number of CAs in existence. Case in point: web browsers are preloaded with several dozen root CAs (most of which go unused, incidentally) but this small group of “trust anchors” allows verifying SSL certificates for millions of websites. The flip side of that high leverage is a weakness: any one of them can undermine trust by deliberately or mistakenly issuing a certificate to the wrong party. The effective security of the system is determined by the least competent or most dishonest CA.
By contrast, PGP uses a web-of-trust model without centralized parties tasked with vouching for everyone else’s keys. Users individually exchange keys. They can also sign each others’ keys to serve as “introductions” for other contacts in the social graph. Such a distributed model is not susceptible to the weakest-link-in-the-chain problem that plagues X.509, where a lot of power is concentrated in an oligarchy of CAs. When users are tasked with managing trust in public-keys one person at a time, the failure of some unrelated third party will not lead to a catastrophic case of mistaken identity across the network.
The main downside is scaling globally. In order to verify signatures, users need the public-key of the person who created the signature; obtaining it is a challenge, to say the least. In the standard PGP model, keys are obtained by following social links. Returning to our example of verifying the Tor binaries, users would ideally have a friend or friend-of-a-friend connected to the developer who created the signature. The Tor project conveniently has a page listing keys– ironically using SSL and the centralized PKI model to bootstrap trust– but that page would have been blocked in our hypothetical scenario, along with the entire Tor website. There are centralized collections of keys such as the MIT PGP key server, but they serve a very different purpose: they act as a directory for looking up keys rather than a trusted third-party vouching for their integrity. Anyone can submit keys, and in fact bogus keys are submitted routinely. (It does not help that the key server runs over HTTP, allowing standard man-in-the-middle attacks to return bogus keys consistent with a forged signature on a binary modified by the attacker.)
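Because the directory does not vouch for anything, the integrity check falls to the user: compare the key’s fingerprint against a value learned through some trusted out-of-band channel. The following sketch simulates that comparison entirely locally with a throwaway key– no key server involved, and the email address is invented.

```shell
# Sketch: a key directory only stores keys; the user must check the
# fingerprint against one obtained out-of-band. Simulated locally here.
set -e
export GNUPGHOME=$(mktemp -d)
chmod 700 "$GNUPGHOME"

# Stand-in for the developer generating their signing key (unprotected
# throwaway key for demo purposes only).
gpg --batch --quiet --gen-key <<'EOF'
%no-protection
Key-Type: RSA
Key-Length: 2048
Name-Real: Demo Developer
Name-Email: dev@example.invalid
%commit
EOF

# The "out-of-band" fingerprint, e.g. read aloud at a conference or
# printed on a business card.
expected=$(gpg --with-colons --fingerprint dev@example.invalid \
    | awk -F: '/^fpr/{print $10; exit}')

# Later: a key claiming the same identity arrives from an untrusted
# directory (here, exported to a file to play that role).
gpg --export dev@example.invalid > downloaded.key
export GNUPGHOME=$(mktemp -d)
chmod 700 "$GNUPGHOME"
gpg --quiet --import downloaded.key
actual=$(gpg --with-colons --fingerprint dev@example.invalid \
    | awk -F: '/^fpr/{print $10; exit}')

# Only a matching fingerprint ties the downloaded key to the right person.
[ "$expected" = "$actual" ] && echo "fingerprint matches"
```

A bogus key submitted under the same name and email would import just as cleanly; only the fingerprint comparison would catch it.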
Usability, or why Johnny can’t verify signatures
Second is a far more practical problem of usability. Authenticode support is built into the operating system, with automatic signature verification before attempting to install software downloaded from the web.
Verifying PGP signatures is not built into any operating system in the same way; the user is on their own. Getting PGP-compatible software is the first order of business. It is not part of Windows or OS X by default, but is commonly found on default installations of popular Linux distributions such as Ubuntu. Since desktop Linux has negligible market share, the effective result is that most users are expected to go out of their way to install unfamiliar software they have likely never used before (or for that matter, will ever use again) only to verify the authenticity of another piece of software they are interested in using immediately.

Suspending disbelief and assuming that motivation exists, the next challenge is using the unfamiliar software for signature verification. While there are GUI front-ends for integrating PGP functionality with popular email clients– which is after all the primary use-case, encrypting and signing email messages– there is no good option for verifying detached signatures on random binaries. Users are expected to drop down to the command-line and type something along the lines of:
$ gpg --verify torbrowser-install-3.6.1_en-US.exe.asc torbrowser-install-3.6.1_en-US.exe
Assuming our determined user has obtained the correct PGP key and marked it as trusted, they will be greeted with this happy news:
gpg: Signature made Tue May  6 16:36:57 2014 PDT using RSA key ID 63FEE659
gpg: Good signature from "Erinn Clark <firstname.lastname@example.org>"
gpg:                 aka "Erinn Clark <email@example.com>"
gpg:                 aka "Erinn Clark <firstname.lastname@example.org>"
gpg: WARNING: This key is not certified with a trusted signature!
gpg:          There is no indication that the signature belongs to the owner.
Primary key fingerprint: 8738 A680 B84B 3031 A630  F2DB 416F 0610 63FE E659
(Incidentally, that ominous-sounding warning about the key not being certified is expected; it does not indicate a problem with the signature itself.)
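For readers who want to experiment without downloading anything, the entire detached-signature workflow above can be reproduced locally with a throwaway key. This is only a sketch: the signer, filenames, and email address are all invented, and a real signing key should never be generated without a passphrase as done here.

```shell
# Self-contained walk-through of signing and verifying a detached
# signature, using a throwaway key in a temporary keyring.
set -e
export GNUPGHOME=$(mktemp -d)
chmod 700 "$GNUPGHOME"
cd "$GNUPGHOME"

# Generate an unprotected demo key (for illustration only).
gpg --batch --quiet --gen-key <<'EOF'
%no-protection
Key-Type: RSA
Key-Length: 2048
Name-Real: Demo Signer
Name-Email: demo@example.invalid
%commit
EOF

# Publisher side: create a detached, ASCII-armored signature,
# the same kind of .asc file the Tor project distributes.
echo "pretend this is an installer binary" > installer.exe
gpg --batch --quiet --armor --detach-sign installer.exe

# User side: verify the signature against the downloaded file.
gpg --verify installer.exe.asc installer.exe
```

Because the key was generated in our own keyring it carries ultimate trust, so this run will not show the “not certified” warning from the Tor example; that warning appears precisely when the key was merely imported rather than vouched for.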