
By Eleftherios Diakomichalis

In this post, we analyze Wikipedia — a site that has achieved tremendous success and scale
through crowd-sourcing human input to create one of the Internet’s greatest public goods.
Wikipedia’s success is particularly impressive considering that the site is owned and operated
by a non-profit organization, and that almost all of its content is contributed by unpaid
volunteers.

The non-commercial, volunteer-driven nature of Wikipedia may cause developers from the
“decentralized web” to question the site’s relevance. However, these differences may be
merely cosmetic: IPFS, for example, has no inherent commercial model, and most of the
open source projects that underlie the decentralized web are built, at least in part, by
volunteers.

We believe that a site that has managed to coordinate so many people to produce such
remarkable content is well worth a look as we search for solutions to similar problems in the
emerging decentralized web.

To better understand Wikipedia’s success, we first survey some key features of Wikipedia’s
battle-tested (to the tune of 120,000 active volunteer editors) coordination mechanisms. Next,
we present some valuable high-level lessons that blockchain projects interested in human
input might learn from Wikipedia’s approach. Finally, we explore vulnerabilities inherent to
Wikipedia’s suite of mechanisms, as well as the defenses it has developed to such attacks.

Wikipedia: key elements


While we cannot hope to cover all of Wikipedia’s functionality in this short post, we start by
outlining a number of Wikipedia’s foundational coordination mechanisms as background for
our analysis.

User and article Talk Pages


While anyone can edit an article anonymously on Wikipedia, most regular editors choose to
register with the organization and gain additional privileges. As such, most editors, and all
articles, have a public metadata page known as a talk page, for public conversations about the
relevant user or article. Talk pages are root-level collaborative infrastructure: they allow
conversations and disputes to happen frequently and publicly.

Since talk pages capture a history of each editor’s interaction — both in terms of
encyclopedia content and conversational exchanges with other editors — they also provide
the basis for Wikipedia’s reputation system.
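To make this idea concrete, here is a minimal sketch of a talk page as an append-only log from which a crude reputation signal could be derived. This is a hypothetical model for illustration, not Wikipedia's actual data structures, and the `naive_reputation` scoring is an assumption of ours:

```python
from dataclasses import dataclass, field

# Hypothetical sketch, not Wikipedia's real data model: a talk page as an
# append-only log of entries, from which a crude reputation signal can be read.

@dataclass
class TalkEntry:
    author: str
    text: str
    endorsement: bool  # True if the entry endorses the page's subject

@dataclass
class TalkPage:
    subject: str                       # the user or article this page is about
    entries: list = field(default_factory=list)

    def post(self, entry: TalkEntry):
        self.entries.append(entry)     # history is only ever appended, never rewritten

    def naive_reputation(self) -> int:
        """Endorsements minus criticisms: a toy stand-in for reputation."""
        return sum(1 if e.endorsement else -1 for e in self.entries)
```

The essential property is the append-only history: because nothing is erased, the full record of an editor's conduct remains available for others to judge.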

Clear and accessible rules


If we think of the collection of mechanisms Wikipedia uses to coordinate its editors as a kind
of “social protocol”, the heart of that protocol would surely be its List of Guidelines and List
of Policies, developed and enforced by the community itself. According to the Wikipedia
page on Policies and Guidelines:
“Wikipedia policies and guidelines are developed by the community… Policies are standards
that all users should normally follow, and guidelines are generally meant to be best practices
for following those standards in specific contexts. Policies and guidelines should always be
applied using reason and common sense.”

For many coming from a blockchain background, such policies and guidelines will likely
seem far too informal to be of much use, especially without monetary or legal enforcement.
And yet, the practical reality is that these mechanisms have been remarkably effective at
coordinating Wikipedia’s tens of thousands of volunteer editors over almost two decades,
without having to resort to legal threats or economic incentives for enforcement.

Enforcement: Peer consensus and volunteer authority


Upon hearing that anyone can edit a Wikipedia page, no money is staked, no contracts are
signed, and neither paid police nor smart contracts are available to enforce the guidelines, an
obvious question is: why are the rules actually followed?

Wikipedia’s primary enforcement strategy is peer-based consensus. Editors know that when peer consensus fails, final authority rests with a small group of privileged volunteers with long-standing reputations at stake.

Peer consensus
As an example, let’s consider three of the site’s most fundamental content policies, often referred to together: “Neutral Point of View” (NPOV), “No Original Research” (NOR), and “Verifiability” (V). These evolved to guide editors towards Wikipedia’s mission of an unbiased encyclopedia.

If I modify the Wikipedia page for Mahatma Gandhi, changing his birthdate to the year 1472,
or offering an ungrounded opinion about his life or work, there is no economic loss or legal
challenge. Instead, because there is a large community of editors who do respect the policies
(even though I do not), my edit will almost certainly be swiftly reverted until I can credibly
argue that my changes meet Wikipedia’s policies and guidelines (“Neutral Point of View”
and “Verifiability”, in this case).

Such discussions typically take place on talk pages, either the editor’s or the article’s, until
consensus amongst editors is achieved. If I insist on maintaining my edits without convincing
my disputants, I risk violating other policies, such as 3RR (explained below), and attracting
the attention of an administrator.
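The 3RR mentioned above is Wikipedia's "three-revert rule": an editor should not perform more than three reverts on a single page within 24 hours. The check below is a hypothetical illustration of that rule, not Wikipedia's actual enforcement code:

```python
from datetime import datetime, timedelta

# Hypothetical sketch of the three-revert rule (3RR): more than three
# reverts by one editor on one page within any 24-hour window is a violation.
THREE_RR_LIMIT = 3
WINDOW = timedelta(hours=24)

def violates_3rr(revert_times):
    """revert_times: datetimes of one editor's reverts on one page."""
    times = sorted(revert_times)
    for i in range(len(times)):
        # count reverts falling inside the 24h window starting at times[i]
        in_window = sum(1 for t in times[i:] if t - times[i] <= WINDOW)
        if in_window > THREE_RR_LIMIT:
            return True
    return False
```

For example, four reverts within a single hour would trip the rule, while three reverts, or four reverts spread over several days, would not.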

Volunteer authority: Administrators and Bureaucrats


When peer consensus fails, and explicit authority is needed to resolve a dispute, action is
taken by an experienced volunteer editor with a long and positive track record: an
Administrator.

Administrators have a high degree of control over content, including blocking and unblocking
users, editing protected pages, and deleting and undeleting pages. Because there are relatively
few of them (~500 active administrators for English Wikipedia), being an administrator is
quite an honor. Once a candidate is nominated, adminship is determined through discussion on the user’s nomination page rather than by voting, with a volunteer bureaucrat gauging the positivity of comments at the end of the discussion. In practice, candidates with more than 75% positive comments tend to pass.
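The bureaucrat's tally can be sketched as a simple support ratio. This is an illustration only: in practice the bureaucrat weighs the arguments made, not just the counts, and the 75% figure is a rough rule of thumb rather than a hard threshold:

```python
# Illustrative sketch of the rough rule of thumb for Requests for Adminship:
# candidates with more than ~75% positive comments tend to pass. In reality a
# bureaucrat judges the discussion's substance, not just the numbers.

def support_ratio(support: int, oppose: int) -> float:
    """Fraction of expressed positions that are supportive."""
    total = support + oppose
    return support / total if total else 0.0

def likely_to_pass(support: int, oppose: int, threshold: float = 0.75) -> bool:
    return support_ratio(support, oppose) >= threshold
```

So a nomination with 90 supportive and 20 opposing comments would likely pass, while one with 60 supportive and 40 opposing would likely fail.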

Bureaucrats are the highest level of volunteer authority in Wikipedia, and are typically administrators as well. While administrators have the final say for content decisions,
bureaucrats hold the ultimate responsibility for adding and removing all kinds of user
privileges, including adminship. Like administrators, bureaucrats are determined through
community discussion and consensus. However, they are even rarer: there are currently only
18 for the entire English Wikipedia.

Since there is no hard limit on the number of administrators and bureaucrats, promotion is meritocratic rather than competitive: candidates are judged on their own record, not against one another.

Evolving governance
Another notable aspect of Wikipedia’s policies and guidelines is that they can change over
time. And in principle, changing a Wikipedia policy or guideline page is no different than
changing any other page on the site.

The fluidity of the policies and guidelines plays an important role in maintaining editors’
confidence in enforcing the rules. After all, people are much more likely to believe in rules
that they helped create.

If we continue to think of the policies and guidelines for Wikipedia as a kind of protocol, we
would say that the protocol can be amended over time and that the governance for its
evolution takes place in-protocol — that is, as a part of the protocol itself.

Lessons for the decentralized web


Now that we have a little bit of background on Wikipedia’s core mechanisms, we will delve
into the ways that Wikipedia’s approach to coordination differs from similar solutions in
public blockchain protocols. There are three areas where we believe the decentralized web
may have lessons to learn from Wikipedia’s success: cooperative games, reputation, and an
iterative approach to “success”.

We also hope that these lessons may apply to our problem of generating trusted seed sets for
Osrank.

Blockchain should consider cooperative games


Examining Wikipedia with our blockchain hats on, one thing that jumps out right away is that
pretty much all of Wikipedia’s coordination games are cooperative rather than adversarial.
For contrast, consider Proof of Work as it is used by the Bitcoin network. Because running
mining hardware costs money in the form of electricity and because only one node can get
the reward in each block, the game is inherently zero-sum: when I win, I earn a block reward;
every other miner loses money. It is the adversarial nature of such games that leaves us
unsurprised when concerns like selfish mining start to crop up.
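The zero-sum structure is easy to see with a little arithmetic. In the stylized model below (an illustration, not real mining economics), each miner's expected revenue is their share of a fixed block reward, so any revenue one miner gains is necessarily revenue the others lose:

```python
# Stylized model: expected per-block profit for each miner, assuming the
# winner is chosen with probability proportional to hashrate. The revenue
# terms always sum to exactly `reward`, so one miner's gain in expected
# revenue is exactly the other miners' combined loss.

def expected_profits(hashrates, reward, cost_per_hash):
    total = sum(hashrates)
    return [reward * h / total - cost_per_hash * h for h in hashrates]
```

Doubling one miner's hashrate raises that miner's expected revenue only by shrinking every other miner's share of the same fixed reward.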
As an even better example, consider Token Curated Registries (TCRs). We won’t spend time
describing the mechanics of TCRs here, because we plan to cover the topic in more detail in a
later post. But for now, the important thing to know is that TCRs allow people to place bets,
with real money, on whether or not a given item will be included in a list. The idea is that,
like an efficient market, the result of the betting will converge to produce the correct answer.

One problem with mechanisms like TCRs is that many people have a strong preference
against playing any game in which they have a significant chance of losing — even if they
can expect their gains to make up for their losses over time. In behavioral psychology, this
result is known as loss aversion and has been confirmed in many real-world experiments.
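Loss aversion can be made concrete with the classic Kahneman and Tversky estimate that losses weigh roughly 2.25 times as much as equal-sized gains. The sketch below illustrates that psychology in general; it is not a model of any particular TCR, and the coefficient is an assumption taken from the behavioral literature:

```python
# Loss-aversion sketch: losses loom larger than gains. The coefficient 2.25
# is the classic Kahneman-Tversky estimate, assumed here for illustration.
LAMBDA = 2.25

def subjective_value(p_win, gain, loss, lam=LAMBDA):
    """Psychological value of a bet: win `gain` with probability p_win,
    otherwise lose `loss` (both given as positive amounts)."""
    return p_win * gain - (1 - p_win) * lam * loss
```

A fair coin flip for 100 tokens has an expected monetary value of zero, yet its subjective value is 0.5 * 100 - 0.5 * 2.25 * 100 = -62.5, which is why many people decline such bets even when the odds are not against them.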

In short, Proof of Work and TCRs are both adversarial mechanisms for resolving conflicts
and coming to consensus. To see how Wikipedia resolves similar conflicts using cooperative
solutions, let’s dive deeper into what dispute resolution looks like on the site.
