To explain why Bitcoin governance appears dysfunctional, it helps to explain why changing Bitcoin is so difficult to begin with: any group or organization tasked with this responsibility faces very long odds. Comparisons to two other large-scale systems are instructive.
Case study: the web
The world-wide web is a highly decentralized system with millions of servers providing content to billions of clients using HTML and HTTP. The software powering those clients and servers is highly diverse. MSFT, Mozilla and Google are duking it out for web-browser market share (and Apple, if one includes the iPhone), while on the server side nginx and Apache are the top predators. Meanwhile there is a large collection of software for authoring and designing web pages, with the expectation that they will be rendered by those web browsers. Mostly for historical reasons, the ownership of standards is divided between the IETF for the transport protocol HTTP and the W3C for the presentation layer of HTML. But neither organization has any influence over the market or power to compel participants in the system to implement its standards. IETF specifications even carry the tentative, modest designation of “RFC” or “request for comments.” There is no punishment for failing to render HTML properly (if there were, Internet Explorer would have been banished eons ago) or, for that matter, for introducing proprietary extensions.
That seems like a recipe for getting stuck, with no way to nudge the entire ecosystem to adopt improved versions of the protocol, or to prevent fragmentation where every vendor introduces their own slightly incompatible variant in search of competitive advantage. (As an afterthought, they might even ask the standards body to sanction these extensions in the next version of the “standard.”) But in reality the web has shown remarkable plasticity in evolving and adopting new functionality. From XMLHttpRequest, which fueled the AJAX paradigm for designing responsive websites, to security features like Content Security Policy and improved versions of the TLS protocol, web browsers and servers continue to add new functionality. Three salient properties of the system help:
- The system is highly tolerant of change and experimentation. Web browsers ignore unknown HTTP headers in the server response. Similarly, they ignore unknown HTML tags and attempt to render the page as best they can. If someone introduces a new tag or HTTP header that is recognized by only one browser, it will not break the rest of the web. Some UI element may be missing and some functionality may be reduced, but these are usually not critical failures. (Better yet, it is easy to target based on audience capability, serving one version of the page to new browsers and a different one to legacy versions.) That means it is not necessary to get buy-in from every single user before rolling out a new browser feature. The feature may get traction as websites adopt it, putting competitive pressure on other browsers to support it, followed by eventual standardization. That was the path for X-Frame-Options and similar security headers originally introduced by IE. Or it may crater and remain yet another cautionary tale about attempts to foist proprietary vendor crud on the rest of the web, as was the case with most other MSFT “extensions” to HTML, including VBScript, ActiveX controls and behaviors.
- There is competition among different implementations. (This was not always the case; MSFT Internet Explorer enjoyed a virtual monopoly in the early 2000s, which not coincidentally was a period of stagnation in the development of the web.)
- There exists a standardization process for formalizing changes, and this process has a credible claim to impartiality. While software vendors participate in work carried out by these groups, no single vendor exercises unilateral control over the direction of standards. (At least that is the theory; hijacking a standards group to lend the imprimatur of the W3C or IETF to what is effectively a finished product already implemented by one vendor is not uncommon.)
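The tolerance described in the first point can be sketched with a toy renderer: unknown tags degrade gracefully instead of breaking the page. This is an illustrative sketch built on Python's standard-library HTML parser, not how any real browser works; the set of "known" tags is invented.

```python
from html.parser import HTMLParser

# Invented, minimal vocabulary standing in for what a legacy browser knows.
KNOWN_TAGS = {"p", "b", "i", "div", "span"}

class LenientRenderer(HTMLParser):
    """Collects text content; unknown tags are noted and skipped,
    but the content inside them still renders."""
    def __init__(self):
        super().__init__()
        self.output = []
        self.unknown = []

    def handle_starttag(self, tag, attrs):
        if tag not in KNOWN_TAGS:
            # An old browser simply ignores the tag itself.
            self.unknown.append(tag)

    def handle_data(self, data):
        if data.strip():
            self.output.append(data.strip())

# A page using a tag this "browser" has never heard of.
page = "<p>Hello <blink>new</blink> <b>web</b></p>"
r = LenientRenderer()
r.feed(page)
print(" ".join(r.output))  # the text survives: "Hello new web"
print(r.unknown)           # ['blink']
```

The parse never aborts: the page degrades gracefully, which is exactly what lets a single vendor ship a new tag without breaking everyone else.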
The challenge of changing Bitcoin
Bitcoin is the exact opposite of the web.
Intolerant of experimentation
Because money is at stake, all nodes on the network have to agree on what constitutes a valid transaction. There needs to be consensus about the state of the blockchain ledger at all times. Consensus can drift temporarily when multiple miners come up with new blocks at the same time, and it is unclear at first which will emerge as the “final” word on the ledger. But the system is designed to eliminate such disagreements quickly and have everyone converge on a single winning chain. What it cannot tolerate is a situation where nodes permanently disagree about which block is valid because some new feature is recognized by only a fraction of nodes. That makes it tricky to introduce new functionality without playing a game of chicken with upgrade deadlines.
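The convergence described above, where every node eventually adopts the same winning chain, can be sketched in a few lines. This is a deliberate simplification: real nodes compare accumulated proof-of-work, not raw chain length, and the block names are invented.

```python
def best_chain(chains):
    """Toy selection rule: adopt the longest chain you have seen.
    (Real Bitcoin nodes compare total proof-of-work.)"""
    return max(chains, key=len)

# Two miners find a block at the same height simultaneously: a temporary fork.
branch_a = ["genesis", "block1", "block2a"]
branch_b = ["genesis", "block1", "block2b"]

# Neither branch is longer yet, so nodes are temporarily split.
assert len(branch_a) == len(branch_b)

# A miner extends branch B; every node applying the rule now converges on it,
# and block2a is orphaned.
branch_b = branch_b + ["block3"]
assert best_chain([branch_a, branch_b]) == branch_b
```

The tie-break is resolved by whichever branch gets extended first, which is why temporary disagreement is harmless but a permanent validity dispute is not.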
There is a notion of soft-forks for introducing new features, which “only” requires a majority of nodes to upgrade as opposed to everyone. These are situations where the change happens to be backwards compatible, in the sense that a node that does not upgrade will not reject valid transactions using the new feature. But it may incorrectly accept bogus transactions, because it is not aware of the additional criteria implied by that feature. Counter-intuitive as that sounds, this approach works because individual nodes only accept transactions once they are confirmed by being included by miners in the blockchain. As long as the majority of miners have upgraded to enforce the new rules, bogus transactions will not make it into the ledger. This soft-fork approach has been flexible enough to implement a surprising number of improvements, including segregated witness most recently. But there are limits: expanding the block-size limit cannot be done this way, because nodes would outright reject blocks exceeding the hardcoded limit even if miners mint them. That would require a hard-fork, the disruptive model where everyone must upgrade by a particular deadline. Those who fail face the danger of splitting off into a parallel universe where transactions move funds in ways that are not reflected in the “real” ledger recognized by everyone else.
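The soft-fork compatibility property can be sketched as follows: the old validation rules are a strict subset of the new ones, so an un-upgraded node never rejects a valid new-style transaction; it merely fails to catch bogus ones, and relies on upgraded miners to keep those out of the chain. The field names and rules below are invented for illustration and are not actual Bitcoin semantics.

```python
def old_node_valid(tx):
    # Legacy rules only: this node is entirely unaware of the new feature.
    return tx["amount"] > 0

def upgraded_valid(tx):
    # Everything the old rules check, PLUS the new feature's constraint.
    # A soft-fork only tightens the rules; it never loosens them.
    if not old_node_valid(tx):
        return False
    if tx.get("uses_new_feature"):
        return tx.get("new_rule_ok", False)
    return True

good = {"amount": 5, "uses_new_feature": True, "new_rule_ok": True}
bogus = {"amount": 5, "uses_new_feature": True, "new_rule_ok": False}

# Old nodes accept both: backwards compatible, but over-permissive.
assert old_node_valid(good) and old_node_valid(bogus)

# Upgraded miners enforce the new rule, so the bogus transaction
# never gets confirmed into a block the old node would see.
assert upgraded_valid(good) and not upgraded_valid(bogus)
```

A hard-fork is the opposite case: the new rules accept something (say, a larger block) that old nodes actively reject, so tightening alone cannot express the change.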
No diversity in implementations
Until recently, virtually all Bitcoin peers were running a single piece of software (Bitcoin Core) maintained by the aptly named core team. Even today that version retains over 80% market share while its closest competitors are forks that are identical in all but one feature, namely the contentious question of how to increase maximum blocksize.
No impartial standard group
The closest thing to an organization maintaining a “specification” and deciding which Bitcoin improvement proposals (BIPs) get implemented is the core team itself. It is as if the W3C were not only laying down the rules of HTML, but also shipping the web browser and web server used by everyone in the world. Yet for all that power, the group still has no mandate or authority to compel software upgrades. It can release new updates to the official client with new features, but it remains up to miners and individual nodes to incorporate that release.
Between a rock and a hard-fork
This leaves Bitcoin stuck in its current equilibrium. Without the flexibility for anyone with a good idea to experiment with new features locally, all protocol improvements must be coordinated by a centralized group, the ultimate irony for a decentralized system. That group is vested with significant power. In the absence of any articulated principles around priorities (keeping the system decentralized, keeping miners happy, enabling greater Bitcoin adoption, etc.), all of its decisions will be subject to second-guessing, met with skepticism or accusations of bias. Without the mandate or authority to compel upgrades across the network, even a well-meaning central planning committee will find it difficult to make drastic improvements or radical changes, lest the system descend into chaos with a dreaded hard-fork. Without the appetite to risk hard-forks, every improvement must be painstakingly packaged as a soft-fork, stacking the deck against timely interventions when the network is reaching a breaking point.