Wikipedia

Authored by: Karen Frost-Arnold

The Routledge Handbook of Applied Epistemology

Print publication date: August 2018
Online publication date: September 2018

Print ISBN: 9781138932654
eBook ISBN: 9781315679099

DOI: 10.4324/9781315679099-3

Abstract

This chapter summarizes the Wikipedia debates within applied epistemology and argues that the social organization of the Wikipedia community shapes its epistemic merits and limitations. First, veritistic systems-oriented social epistemology is used to evaluate Wikipedia’s impact on the formation and dissemination of true beliefs within its community of users. Second, the chapter draws on the social epistemology of trust to argue that Wikipedia’s epistemic status depends on its complex and shifting relations of trust and distrust. Finally, this chapter applies feminist epistemology to show that lack of diversity in Wikipedia’s community is a pressing epistemic threat.


Introduction

As a free encyclopedia produced mostly by volunteer contributors, Wikipedia is an ambitious experiment in online collaborative knowledge dissemination. It is one of the most visited sites on the Internet. It receives 374 million unique visitors every month and regularly appears as the top Google result for factual queries (“Wikipedia:About”). There are 280 active Wikipedias in different languages (“List of Wikipedias”). The English-language Wikipedia is the largest and is the focus of this chapter. English Wikipedia (hereafter ‘Wikipedia’) has over 5 million articles (“Wikipedia:About”) and over 30,000 active Wikipedians editing content (“Active Wikipedians”). As one of the most commonly used reference sites, Wikipedia certainly merits applied epistemologists’ attention.

Wikipedia has lofty epistemic goals and is designed and constantly updated with epistemic considerations in mind. The Wikipedian dream is an army of volunteers drawing on reliable, independent sources to provide free access to an accurate, neutral source of encyclopedic knowledge. Anyone can edit most pages and help shape the policies and organizations that maintain Wikipedia’s community. This openness is designed to add content rapidly. Additionally, in place of slower editorial oversight or expert peer review procedures, Wikipedia relies on its contributors to check that articles meet its standards of a neutral point of view, verifiability, and no original research (“Wikipedia:Core content policies”).

But does the reality live up to Wikipedians’ dreams? As a normative enterprise, applied epistemology is uniquely positioned to evaluate to what extent Wikipedia actually improves or damages the landscape of human knowledge. This chapter summarizes the Wikipedia debates within applied epistemology and argues that the social organization of the Wikipedia community shapes its epistemic merits and limitations. A useful framework for the epistemology of Wikipedia is veritistic systems-oriented social epistemology (see Fallis 2011). Systems-oriented social epistemology evaluates epistemic systems according to their epistemic outcomes for community members (Goldman 2011). A veritistic social epistemology takes true belief as the fundamental epistemic good (Goldman 1992, 1999). Thus, a veritistic systems-oriented social epistemology of Wikipedia evaluates Wikipedia’s impact on the formation and dissemination of true beliefs within its community of users. Alvin Goldman (1992) lays out five veritistic standards, summarized by Paul Thagard as follows:

  1. the reliability of a practice is measured by the ratio of truths to total number of beliefs fostered by the practice;
  2. the power of a practice is measured by its ability to help cognizers find true answers to the questions that interest them;
  3. the fecundity of a practice is its ability to lead to large numbers of true beliefs for many practitioners;
  4. the speed of a practice is how quickly it leads to true answers;
  5. the efficiency of a practice is how well it limits the cost of getting true answers (Thagard 1997: 247).
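
These standards lend themselves to a rough formalization. As an illustrative sketch in notation of my own (not Goldman’s or Thagard’s), let B be the set of beliefs a practice fosters, T ⊆ B the true beliefs among them, and Q the set of questions that interest the practice’s users:

\text{reliability} = \frac{|T|}{|B|}, \qquad \text{power} = \frac{|\{\, q \in Q : \text{the practice yields a true answer to } q \,\}|}{|Q|}, \qquad \text{efficiency} = \frac{|T|}{\text{cost of the practice}}

Fecundity then scales with how many practitioners acquire the truths in T, and speed varies inversely with the time a true answer takes to reach them.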

Section 1 evaluates Wikipedia according to standard 1, while section 2 focuses on standards 2–5. Section 3 draws on the social epistemology of trust to argue that Wikipedia’s epistemic status depends on its complex and shifting relations of trust and distrust between community members. Finally, section 4 applies feminist epistemology to show that lack of diversity in Wikipedia’s community is a pressing epistemic threat, especially to standards 1 and 2.

1  Is Wikipedia reliable? Should we trust it?

Much epistemic analysis of Wikipedia focuses on the question of Wikipedia’s reliability: is most of the information in Wikipedia accurate (see Fallis 2011: 299)? Work on the price system, prediction markets, and the Condorcet jury theorem suggests that aggregating the viewpoints of many people can produce reliable knowledge, under certain conditions (Estlund 1994; Hayek 1945; Sunstein 2006; Surowiecki 2004). However, Don Fallis (2008) argues that this does not adequately explain Wikipedia’s reliability. Wikipedians do not simply aggregate their views (e.g., they often reach consensus through deliberation rather than voting); any current article on Wikipedia reflects the viewpoint of the most recent editor (rather than an aggregated judgment), and often articles are created by a small group instead of a large crowd (Fallis 2008: 1670).
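
To see why aggregation can produce reliability, consider the standard statement of the Condorcet jury theorem (a textbook formulation of my choosing, not drawn from the works cited above). If n voters answer a binary question independently, each correctly with probability p > 1/2, then the probability that a majority answers correctly is

P_n = \sum_{k > n/2} \binom{n}{k} \, p^k (1-p)^{n-k},

and P_n approaches 1 as n grows. With p = 0.6, for instance, P_3 ≈ 0.65 while P_101 ≈ 0.98. Part of Fallis’s point is that Wikipedia’s editing process does not satisfy the theorem’s assumptions of independent voting and mechanical aggregation.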

Whether or not Wikipedia illustrates the wisdom of crowds, studies show that it is relatively reliable (Fallis 2008; “Reliability of Wikipedia”). For example, a widely read Nature study found that Wikipedia’s reliability compares well with Encyclopedia Britannica’s (Giles 2005). 1 Heilman et al. (2011) review several studies showing that Wikipedia’s medical information is generally reliable, and more recent studies concur (Kräenbring et al. 2014; Kupferberg and Protus 2011; Temple and Fraser 2014). Despite these encouraging results, applied epistemologists’ concerns about reliability center around three features of the encyclopedia: non-expert authorship, anonymous editing, and incentives to damage Wikipedia.

Anyone can edit most pages in Wikipedia, which means that editors (i.e., those who write, edit, remove, and categorize information on Wikipedia) are often not experts on the topic. Wikipedia is not a source of new knowledge production that requires content expertise. Instead, Wikipedia aims to mobilize an army of people who collect and disseminate information from reliable, independent sources. Experienced Wikipedians may be “informational experts” who possess procedural skill in managing information (such as finding and inserting citations in Wikipedia articles) (Hartelius 2011; Pfister 2011). Nonetheless, one might worry that Wikipedians who lack content expertise will not know which topics are notable and important, which sources are reliable, which claims have been refuted, etc. (Fallis 2011: 300). Additionally, many content experts have been alarmed at non-experts editing or deleting their work on Wikipedia (Healy 2007). When this happens, the reliability of Wikipedia can suffer, particularly if accurate claims by content experts are removed or replaced with inaccurate ones by non-experts. While Wikipedia does not grant content experts additional authority (in part, because it has no mechanism for verifying the expertise of editors (“Wikipedia:Expert editors”)), Wikipedians have made attempts to reach out to experts by soliciting their input in ways that do not require experts to learn how to navigate the wiki infrastructure (Lih 2015; Lih et al. 2015). Such efforts, if successful, could improve the reliability of the encyclopedia.

Wikipedia allows editors to be anonymous. Anonymity is a second source of concern for epistemologists who believe that holding people’s offline identities accountable for errors will increase the accuracy of their contributions (Sanger 2009; Wray 2009). Wikipedians can create accounts with a pseudonym, or they can edit with only their IP address being logged. Thus, an editor can add inaccurate information without tarnishing their offline identity. Since editors’ offline identities cannot be punished, some worry that editors lack incentive to live up to expectations of accuracy. Larry Sanger, a co-founder of Wikipedia, left the project, in part, because of such concerns: “anonymity allows those with an anti-intellectual or just crotchety bent to attack experts without a restraint that many would no doubt feel if their real names were known” (Sanger 2009: 66). And K. Brad Wray argues that the epistemic culture of Wikipedia is inferior to that of science precisely because scientists have a reputation to protect, while anonymous Wikipedians do not (Wray 2009: 39–40). While these arguments have merit, they ignore two points. First, there are punishments for untrustworthy behavior on Wikipedia. Although there is rarely punishment for users’ offline identities, there is a system for sanctioning users’ online identities by, for example, banning an account from the site. Thus, there is some accountability for inaccuracy. 2 I discuss these sanctions in section 3. Wikipedians, though, are generally very sensitive to privacy issues and rarely publicly release the offline identity of bad actors (Owens 2013). Second, demands for increased accountability ignore the epistemic value of online anonymity. Members of vulnerable populations who have legitimate fears of reprisal and harassment might only contribute their expertise to Wikipedia under the protection of anonymity (Fallis 2008: 1668; Frost-Arnold 2014a). In terms of reliability, Wikipedia’s anonymity may allow some vulnerable Wikipedians to engage in talk-page discussions that help weed out errors, thereby improving the accuracy of the encyclopedia.

A third epistemic concern is that Wikipedia’s openness attracts editors with incentives to harm the epistemic project (Fallis 2011: 300). Trolls, vandals, public relations firms, politicians, and harassers are all potential threats to reliability, since it may be in their interest to add false content to articles. For example, editing under a conflict of interest, especially undisclosed paid editing, is frowned upon in Wikipedia (“Wikipedia:Conflict of interest”). In August 2015, administrators blocked 381 user accounts for charging money to post articles (Erhart and Barbara 2015). In this scam, someone posing as an experienced Wikipedian or administrator used records of previously rejected articles to contact their subjects, often small businesses and artists, and offered to add the content once payment had been received (“Wikipedia:Long-term abuse/Orangemoody”). This and other cases of paid editing give critics cause to worry about Wikipedia’s reliability. However, one might be comforted by the fact that Wikipedia’s volunteer army did uncover the scam and take corrective measures (Erhart and Barbara 2015). Additionally, Wikipedia has worked with major PR firms to create a professional ethics culture in which PR firms do not violate Wikipedia policies, including those related to conflict of interest (Lih 2015; “Wikipedia:Statement on Wikipedia from participating communications firms”).

Wikipedia’s openness also attracts trolls and vandals who introduce inaccurate content for their own amusement. In the widely reported Seigenthaler incident, a prankster edited the biography of journalist John Seigenthaler to speculate that Seigenthaler had been involved in the assassinations of John and Robert Kennedy. Seigenthaler publicly attacked Wikipedia for failing to detect the error quickly and for not identifying the vandal (Seigenthaler 2005). In response, Wikipedia instituted new measures to protect the encyclopedia against vandalism: (1) anonymous users were prevented from creating new articles, (2) a Biography of Living Persons policy was created to provide guidelines for removing questionable material in biographies, and (3) a semi-protection tool was introduced, which prevents unregistered or newly registered users from editing articles that may be targets of vandalism (Lih 2009: 191–94). While these measures cannot make Wikipedia vandal-free, they can increase its reliability. In sum, there is some empirical support for Wikipedia’s reliability, and it may be reliable enough to meet our epistemic goals, especially when the stakes for error are not high (Fallis 2008: 8). While applied epistemologists have raised concerns about its lack of experts, anonymity, and openness to those with harmful goals, Wikipedia has policies in place to address these issues and continually improve its reliability.

Leaving aside whether Wikipedia is in fact reliable, a further question of epistemological importance is whether we can be justified in believing it reliable, or as P. D. Magnus asks: “Can we trust it?” (Fallis 2008, 2011; Magnus 2009). If we can justifiably trust Wikipedia, our justification will have to be different from our justification for trusting traditional testimony, since Wikipedia has no single, identifiable author (Magnus 2009; Simon 2010; Tollefsen 2009; Wray 2009). 3 Judith Simon (2010) argues that trust in Wikipedia is best viewed as an instance of procedural trust – trust in the process which generates Wikipedia content. One might take a user’s knowledge about Wikipedia’s safeguards for reliability as good justification for their trust in the process. However, a problem with procedural trust in Wikipedia is that Wikipedia is dynamic – anyone can change the content at any moment. Thus, while we may have good reason to believe that Wikipedia as a system is reliable on average, “this overall trustworthiness does not help us to assess the trustworthiness of a specific claim in Wikipedia” (Simon 2010: 349).

So, are there any tools to assess the trustworthiness of specific claims? Magnus argues that Wikipedia frustrates our usual methods of evaluating the trustworthiness of online claims. For example, one might use the method of sampling by checking a claim on a website against other sources for corroboration. However, many other sites copy material from Wikipedia, so one may inadvertently use Wikipedia as a self-confirming source (Magnus 2009: 87). Relatedly, Wikipedia has been a locus of ‘citogenesis’: “the creation of ‘reliable’ sources through circular reporting” (“Wikipedia:List of citogenesis incidents”). In citogenesis, information is added to Wikipedia, which is then picked up in media reports, which are in turn used as ‘independent’ sources to support the original information in Wikipedia. Another way Wikipedia frustrates our usual tools of verification is through multi-authored content. I might evaluate the reliability of a blog post about physics by a self-professed physicist using standards of plausibility of style (Does the author write like a physicist?) or calibration (Suppose I know something about physics. Is the author correct about the physics claims that I know? If so, then the author is likely to be accurate about physics outside my area of knowledge). But a Wikipedia article is potentially written by hundreds of people who do as little as add one claim or tidy up the spelling and style. Thus, the fact that the article is written in the style of a physicist is no guarantee that any one claim was written by an expert in physics; and since each claim could be written by a different author, accuracy in one claim is no guarantee that other claims were written by the same knowledgeable author (Magnus 2009: 85–87). So, Wikipedia may be challenging to verify, and trust in its claims may be hard to justify.

However, Fallis (2011) and Simon (2010) argue that there may be other ways to verify Wikipedia’s content. Wikipedia grants access to every article’s editing history (showing readers whether the article has been the subject of a recent edit war – a red flag for bias), and it provides editing history for every editor (letting readers verify whether a claim was made by an editor with a history of vandalism). 4 The ‘talk’ pages allow readers to see critical discussion about the reliability of content. And dispute templates are posted in many Wikipedia articles warning readers that, for example, “The truthfulness of this article has been questioned.” Thus, readers who take the time to use these tools can put themselves in a position to gain defeaters that would undermine trust in Wikipedia content. In sum, determining whether to trust Wikipedia’s claims may be difficult using traditional methods of verification, but other methods are available.
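
These verification tools are also machine-readable, which matters for readers who want to audit articles at scale. Below is a minimal sketch (in Python, using the third-party requests library; the article title is only an example) of how one might retrieve an article’s recent revision history from the public MediaWiki API:

import requests

API = "https://en.wikipedia.org/w/api.php"

def recent_revisions(title, limit=10):
    """Fetch the most recent revisions of one Wikipedia article."""
    params = {
        "action": "query",                   # standard MediaWiki query module
        "prop": "revisions",                 # request the page's revision history
        "titles": title,
        "rvprop": "user|timestamp|comment",  # who edited, when, and their summary
        "rvlimit": limit,
        "format": "json",
    }
    data = requests.get(API, params=params).json()
    page = next(iter(data["query"]["pages"].values()))
    return page.get("revisions", [])

# A burst of rapid, contested edits is the 'edit war' red flag mentioned above.
for rev in recent_revisions("Harriet Tubman"):
    print(rev.get("timestamp"), rev.get("user"), rev.get("comment", ""))

The same API exposes each editor’s contribution history and each article’s talk page, so the defeater-gathering that Fallis and Simon describe can, in principle, be partly automated.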

2  A broader epistemology of Wikipedia

Reliability is not the only epistemic virtue, and applied epistemologists have analyzed Wikipedia’s success according to Goldman’s other standards of power, speed, fecundity, and efficiency. While error-avoiding epistemologists take reliability as the primary epistemic virtue, truth-seeking veritists take power to sometimes trump reliability (Coady 2012: 5–7; Fallis 2006: 182–83; Frost-Arnold 2014a: 66–68). An encyclopedia with only one entry consisting of all true claims (e.g., about Harriet Tubman) would be a maximally reliable source, but it would not be very powerful or encyclopedic (Coady 2012: 170). We want an encyclopedia to help its users to know many truths, as well as avoid errors. And an encyclopedia that helps users attain more true beliefs than its competitor would have an epistemic advantage, along one dimension. On this score, Wikipedia is incredibly epistemically successful. Encyclopedia Britannica announced it was ending print production in 2012; at the time, Britannica had 65,000 articles compared to Wikipedia’s 3,890,000, almost sixty times as many (Silverman 2012). While Wikipedia’s openness appeared to be a concern for its reliability, it is a boon to its power – thousands of volunteer non-experts can create and expand more entries than an encyclopedia written by a smaller number of experts (Fallis 2011: 305).

Wikipedia also measures well against Goldman’s standards of speed, fecundity, and efficiency (Fallis 2011: 305). In the era of smartphones, Wikipedia provides instant, convenient answers to questions for many people. Additionally, Wikipedia’s openness is explicitly designed to allow for fast content creation (Fallis 2008: 1669; Sunstein 2006: 150). An initial attempt at an online encyclopedia by Jimmy Wales and Larry Sanger, Nupedia, had such a demanding peer review process that it only produced about two dozen articles in its first year (Lih 2009: 40–41). In comparison, Wikipedia produced 20,000 articles in its first year (Lih 2009: 77). Many Wikipedians are open access advocates, and the project’s emphasis on free online content increases its fecundity – it allows true beliefs to be acquired by many people. Similarly, Wikipedia is efficient as a free source of knowledge for users, and its online platform and use of volunteer editors have cost-saving benefits in production.

3  Wikipedia and trust

The epistemic culture of Wikipedia is central to its epistemic successes and failings (cf. Coady 2012: 171). While most readers perceive Wikipedia as a static piece of text, Wikipedia is actually a dynamic community. Wikipedians interact on talk pages, meet in person, collaborate on Wiki Projects, organize edit-a-thons, and hold conferences. This section analyzes one key feature of Wikipedia’s culture: its complex and shifting relations of trust and distrust.

Wikipedia started with a small community, and personal relations of trust between people who knew each other were central in founding the community (Lih 2009). Paul de Laat (2010) shows that as new members were recruited, Wikipedia used hopeful trust, a form of trust with interesting epistemic significance, to motivate trustworthiness in newbies. On Victoria McGeer’s (2008) account of hopeful trust, the trustor makes herself vulnerable by putting herself in the hands of the trustee, but her hope that the trustee will not take advantage of that vulnerability encourages the trustee to live up to the trust. Thus, hopeful trust can inspire people to be more trustworthy. It does this because the trustor’s hopeful vision can be motivating – it prompts the trustee to think: “I want to be as she sees me to be” (McGeer 2008: 249). In this way, a hopeful vision can make the trustee a kind of role model to herself. Hopeful trust is epistemically interesting because one’s own trust in someone can count, under the right conditions, as evidence that they will be trustworthy.

In their relations with one another, Wikipedians often endorse a powerful form of hopeful trust (de Laat 2010). First, members of the Wikipedia community make themselves vulnerable – they put themselves in each other’s hands. Wikipedia maintains an openness that makes it vulnerable to those with harmful motives. In the face of this threat, Wikipedians could adopt a default attitude of distrust to new members, forcing them to get permission to edit after successfully completing screening procedures. But Wikipedians instead adopt an attitude of qualified trust toward newcomers. One reason Wikipedians often give for this approach is that it inspires people to do better. As founder Jimmy Wales puts it,

[B]y having complex permission models, you make it very hard for people to spontaneously do good … There are so many hostile communities on the Internet. One of the reasons is because this philosophy of trying to make sure that no one can hurt anyone else actually eliminates all the opportunities for trust … [Wikipedia is] about leaving things open-ended, it’s about trusting people, it’s about encouraging people to do good.

Wales 2009: xvii–xviii

A second sign of Wikipedia’s attitude of hopeful trust is its commitment to the principles of “Assume good faith,” “Don’t bite the newbies,” and “Be bold” (“Wikipedia:Assume good faith”; “Wikipedia:Be bold”; “Wikipedia:Please do not bite the newcomers”). With these principles, Wikipedians encourage new editors to make edits without fear of breaking the encyclopedia. In other words, Wikipedians make themselves vulnerable to damage, and they do so with hope by assuming good faith (de Laat 2010: 332). Of course, Wikipedians are a diverse community, and disputes about whether members are actually following these practices abound, but the community consensus guidelines espouse this attitude of hopeful trust. Third, Wikipedia’s most public ambassador, Jimmy Wales, presents a hopeful vision of the community in his public discussions. For example, in a CNN discussion with Seigenthaler, Wales said, “Generally we find most people out there on the Internet are good. … It’s one of the wonderful humanitarian discoveries in Wikipedia, that most people only want to help us and build this free nonprofit, charitable resource” (Wales 2005). 5 In sum, Wikipedians make themselves vulnerable to harm, but they often do so with an attitude of hopeful trust in fellow members and newcomers. Wikipedia’s communications about community norms hold out to members a vision of trustworthy commitment to the production of a free, reliable encyclopedia built by good-faith volunteers. If McGeer is right that hopeful trust can inspire trustworthiness, this may help explain some of Wikipedia’s epistemic success – its attitude of hopeful trust and its vision of epistemic trustworthiness motivate users to work for the epistemic good of the community.

On the other hand, Wikipedia’s attitude of hopeful trust is also qualified and balanced with a distrustful attitude and a rational choice approach to trust. 6 Wikipedia started as a small community where common forms of interpersonal trust could have force. If I know some other Wikipedians, and I care what they think of me, then it is more likely that I can find their vision of me as a fellow Wikipedian motivating. But as Wikipedia grew, it was bound to attract large numbers of users who had only fleeting interactions with the community. Therefore, it is not surprising that Wikipedia was rocked by scandals of bad actors and consequently developed an attitude of distrust to qualify the attitude of hopeful trust (see de Laat 2010). As discussed earlier, in response to the Seigenthaler incident, Wikipedia instituted methods to improve its reliability: (1) no article creation for anonymous users, (2) a Biography of Living Persons policy, and (3) semi-protection. Moreover, software tools and autonomous bots have proliferated to help Wikipedians detect and revert vandalism (de Laat 2015). Many of these mechanisms stem from a mistrust of certain types of editors – for example, anonymous or newly registered editors.
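
To make concrete how such mistrust can be encoded in tooling, consider a toy rule-based filter of my own devising (in Python; a deliberately crude illustration of the general approach, not the logic of any actual Wikipedia bot, which are far more sophisticated):

def flag_for_review(old_text: str, new_text: str,
                    is_anonymous: bool, edit_count: int) -> bool:
    """Score an edit on crude signals and route suspicious ones to a patroller."""
    score = 0.0
    if len(new_text) < 0.2 * len(old_text):
        score += 1.0   # large blanking of existing content
    if any(mark in new_text for mark in ("!!!", "HAHAHA")):
        score += 1.0   # juvenile insertions
    # Encoded distrust: anonymous and very new editors start under suspicion.
    if is_anonymous:
        score += 0.5
    if edit_count < 10:
        score += 0.5
    return score >= 1.0

On this toy rule, an anonymous newcomer’s edit gets flagged even when its content is entirely innocuous – precisely the kind of institutionalized distrust of editor types that de Laat (2015) examines.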

Additionally, as Wikipedia grew, it was no longer feasible for Jimmy Wales to act as a mediator and judge in disputes. So, in 2004, Wales recruited volunteers to set up a mediation committee and an arbitration committee (ArbCom) (Lih 2009: 180). While the mediation committee has seen relatively little use, ArbCom is busy – it is the main sanctions arm of Wikipedia, delivering punishments such as probation, restrictions on the topics a Wikipedian can edit, and bans from the community (“Wikipedia:Editing restrictions”). Systems of punishment are signs of a rational choice approach to trust. This approach models individuals as self-interested rational actors and maintains that people act in a trustworthy manner when there are sufficient external constraints, most notably punishment for untrustworthiness, to make it in their self-interest to act as expected. Wikipedia has guidelines for conduct and a system of punishment for those who violate the rules. The goal of such systems is to allow members to cooperate with each other without needing to assume that their community members share altruistic or pro-social motivations. Following Hobbes, the rational choice approach thus takes a bleaker view of human nature than the hopeful trust approach. Internet historian Jason Scott summarizes the contrast as follows: “Wikipedia holds up the dark mirror of what humanity is, to itself” (quoted in Lih 2009: 131). Because of all the vandalism, trolls, and harassment on Wikipedia, one might be persuaded to adopt this pessimistic view. Whether Wikipedia’s system of punishment is sufficient to keep bad actors at bay is an open question, constantly debated within the community. 7

It may be that Wikipedia’s success lies in its ability to blend both of these approaches to trust. It may be descriptively accurate that Wikipedia’s early epistemic successes depended on an attitude of hopeful trust and interpersonal connections. But the pressures of growth turned the community more toward the threat of punishment. However, as I have argued above, there are still remnants of hopeful trust’s openness to newcomers and articulation of a vision of trustworthiness aimed at inspiring new members. 8 For the applied epistemologist, the interesting question is whether these changing relations of trust support or undermine Wikipedia’s epistemic goals.

4  Wikipedia’s gaps and biases

Lack of diversity is an epistemic problem for Wikipedia. Estimates of the proportion of Wikipedians who are women range from 9% to 16% (Hill and Shaw 2013; Wikimedia Foundation 2011). The Wikimedia Foundation does not collect data on the race or ethnicity of editors in its surveys, and other data are scarce, but anecdotal evidence suggests that Wikipedia lacks multicultural diversity (Backer 2015; Murphy 2015; Reynosa 2015). Wikipedia also has a Western bias (Hern 2015). There is no consensus on the causes of Wikipedia’s diversity problem. Determining the true causes of the diversity gap requires careful empirical study and falls outside the purview of the applied epistemologist. Applied epistemology can contribute (1) an analysis of the epistemic significance of the imbalance and (2) an evaluation of the epistemic consequences of proposed solutions to the gap. I address both below.

The demographics of Wikipedia editors have several epistemic implications. First, concerns about diversity on Wikipedia are most commonly framed as concerns about gaps in coverage of topics related to women, people of color, and non-Western subjects, which is largely a concern about the power (in Goldman’s sense) of Wikipedia. For instance, Lam et al. (2011) found that topics of interest to women have shorter articles in Wikipedia than do topics of interest to men. And women’s biographies account for only 15.5% of all biographies in Wikipedia (Graells-Garrido et al. 2015). If women editors are more likely than men to create biographies of women, then attracting more women editors may increase the number of biographies of women, thereby increasing the number of truths available on Wikipedia. 9 Thus, lack of diversity means that fewer truths are available to cognizers.

Second, lack of diversity undermines the reliability of Wikipedia. In discussions of Wikipedia’s reliability, it is often argued that errors are weeded out by communal scrutiny. But as feminist epistemologists have long argued, communities that lack diversity can perpetuate bias. Background assumptions shaped by individuals’ social locations can shape which claims they consider supported by evidence. Errors can be detected when putative knowledge claims are subjected to critical scrutiny by people from different backgrounds, who can often recognize the unconscious biases in the background assumptions and purported evidential support for false claims (Frost-Arnold 2014a: 71; Goldman 1999: 78; Intemann 2010; Longino 1990). One of the main projects of feminist science studies has been to provide case studies of knowledge communities that detected errors and bias when women or people of color entered scientific disciplines in greater numbers. Returning to Wikipedia, this means that with fewer women, people of color, and members of other underrepresented groups making edits or taking part in the discussions on article talk pages, fewer errors are detected and the reliability of Wikipedia suffers.

Third, lack of diversity facilitates the proliferation of misleading truths. Women’s biographies on Wikipedia contain more marriage-related events than do biographies of men (Graells-Garrido et al. 2015). There is an epistemic problem here, but what is it? Consider the statement in a particular woman’s biography that she was married to so-and-so. Assuming the spouse was correctly identified, this statement might not seem epistemically problematic, because it is true. Perhaps the problem is a failure of power because Wikipedia is missing some truths (namely, the marital status of many men), but this fails to fully capture the problem. It is not just that some truths are missing; it is that the truths which do appear, in conjunction with the absence of others, can be misleading and harmful. A reader of Wikipedia biographies is exposed to misleading evidence – based on what they read, they have reason to believe the following false claim (among others): ‘A woman’s spouse is more relevant to her accomplishments than a man’s spouse is.’ 10 This is epistemically problematic, as a false claim, but it is also ethically harmful, as it contributes to the devaluation of women’s accomplishments and promotes the stereotype that women’s identities are more tied to their families than to their professional accomplishments. While women are themselves vulnerable to unconscious bias and can also perpetuate stereotypes, it is likely that women have more at stake in correcting such misleading information in Wikipedia. Therefore, more diversity in editorship seems likely to diminish Wikipedia’s problem with misleading truths.

The Wikimedia Foundation set the goal of increasing women Wikipedians to 25% by 2015 (Cohen 2011). Although this initiative has failed, many other projects aim to improve Wikipedia’s gender and multicultural diversity (“AfroCROWD:About”; “Wikipedia:WikiProject Countering systemic bias”; “Wikipedia:WikiProject Women in Red”). Applied epistemology can be useful in assessing the epistemic merits of these proposed solutions. To illustrate, I show how one solution falls afoul of “the problem of speaking for others” (Alcoff 1991).

One of the suspected causes of Wikipedia’s diversity problem is its reliance on the notability guideline and verifiability policy. The notability guideline ensures that the truths in Wikipedia are non-trivial and interesting, 11 and the verifiability policy obviously makes Wikipedia more verifiable and trust in it more rationally justifiable. While epistemically helpful in these ways, notability and verifiability can perpetuate bias within a community lacking diversity (Backer 2015; Stephenson-Goodknight 2015). Assessments of an article’s notability and a source’s legitimacy are made in light of background assumptions, which can be biased by one’s social location. To a white, Western editor, an article on an artwork of cultural importance to an Asian subpopulation may not seem notable. The nutshell description of the notability guideline reads,

Wikipedia articles cover notable topics—those that have gained sufficiently significant attention by the world at large and over a period of time, and are not outside the scope of Wikipedia. We consider evidence from reliable independent sources to gauge this attention.

Wikipedia:Notability

Following this guideline requires assessing whether a topic has gained “sufficiently significant attention by the world at large,” which is a judgment made in light of assumptions about what makes attention significant, who or what counts as good indicators of the world’s attention, etc. Similarly, one must make judgments about whether there exist reliable independent sources to document this attention, and this will be done in light of background assumptions about which sources are reliable. The problem is that these background assumptions can be biased. Alice Backer, founder of Afro Free Culture Crowdsourcing Wikimedia (AfroCROWD), an initiative to increase the participation of people of African descent in Wikimedia (“AfroCROWD:About”), gives the example of the proposed article on “Garifuna in Peril,” a film about the Garifuna community (an Afro-Honduran community) (Backer 2015). The creator of the article, Garifuna blogger Teofilo Colon, tried multiple times to add the article to Wikipedia, but each time it was rejected by a user as non-notable and lacking independent verifiable sources to support notability, despite the fact that Colon continued to add dozens of sources, including well-established Central American newspapers. Now we cannot know with certainty the reasons why the Wikipedian who rejected the article found it to be non-notable and poorly sourced, but it is not hard to imagine that the typical white, Western Wikipedian is ignorant about Garifuna culture, unaware of the films that have meaning in Garifuna communities, and uninformed about Central American news sources. Our social location shapes our background knowledge, which biases our assessment of what is notable and which sources are reliable. Hence, not surprisingly, the contributions of women, people of color, and other marginalized communities are often rejected.

Now one proposed solution to this problem is for an experienced editor, who is a trusted Wikipedian, to post the new article on behalf of less experienced editors who belong to minority groups. Thus, when Wikipedians look at the new article and see that it was created by the trusted Wikipedian, they will be less likely to recommend it for deletion as non-notable. This solution was repeatedly proposed at WikiConference USA 2015, and examples were offered of instances in which this strategy had effectively added new articles, which had previously been deleted when submitted by people of color. Notice that this solution leverages Wikipedia’s relations of trust – it recognizes that a large number of previous edits and known history of valuable contributions to the community make one a trustworthy editor, and trust in the editor behind an article can override concerns about the notability or verifiability of the content.

However, applied epistemology reveals a problem with this solution. It runs afoul of what Linda Alcoff calls “The problem of speaking for others” (Alcoff 1991) and other problems with advocating on behalf of marginalized others (Code 2006; Frost-Arnold 2014b; Ortega 2006). Advocacy can be epistemically beneficial; it can allow the claims of marginalized speakers to be heard, when they might otherwise be ignored or rejected. However, advocacy also perpetuates a system in which the speech of marginalized people is not heard or accepted in their own voice. When people of color need to give their Wikipedia articles to established white Wikipedians to prevent their article from being deleted, we still have a system in which people of color are not trusted editors. Additionally, when an advocate speaks “in place of” a marginalized subject, the advocate buttresses their social status and epistemic credibility on the back of the epistemic labor of the marginalized. While the intentions of the advocate may be noble, this still perpetuates unjust privilege. White, established Wikipedians will ultimately receive the credit for the addition of the new articles in their editing history, and Wikipedians of color will still find that their content is rejected when added under their own identities. This solution does not push established Wikipedians to address their own biases or learn to trust new community members who belong to more diverse populations. While no strategies are without problems, other solutions may be more epistemically fruitful. Collaborations between applied epistemologists and Wikipedians could be a source of new ideas for addressing the diversity gap.

In conclusion, Wikipedia is fertile ground for applied epistemology. As a dynamic and transparent community with explicitly epistemic goals, Wikipedia provides opportunities to study the production and dissemination of knowledge through online collaboration. While debates persist about the reliability and verifiability of Wikipedia’s content, Wikipedia is constantly adjusting to protect its epistemic standing. Wikipedia’s shifting relations of trust and distrust raise important normative questions about which modes of social organization are best suited for epistemic communities at various stages of development. And, while the epistemic damage due to a lack of diversity is well-traveled terrain for the applied epistemology of science, Wikipedia’s diversity gaps both display similar problems and raise new challenges for a digital age.

Acknowledgments

I thank Teofilo Colon, Michael Hunter, Alla Ivanchikova, Rosie Stephenson-Goodknight, K. Brad Wray, and an anonymous reviewer for helpful comments.

Notes

1. Encyclopedia Britannica critiqued the Nature study (Encyclopedia Britannica 2006). See Nature (2006) for Nature’s rebuttal.

2. Sanger recognizes that there is some accountability but thinks accountability that does not target one’s offline identity is insufficient to prevent some of the poor behavior on Wikipedia.

3. In fact, some question whether Wikipedia provides testimony at all (Tollefsen 2009; Wray 2009).

4. Editing history is a mark of trustworthiness in Wikipedia. Wikipedians often introduce themselves by listing their number of edits, and many post summaries of their editing history on their user pages.

5. For quotes from other Wikipedians espousing Wales’s hopeful vision, see de Laat (2010).

6. De Laat (2010) analyzes Wikipedia’s balance between trust and distrust in terms of the discretion offered to editors as a result of hopeful trust, tempered with a decrease in discretion due to increasing rules and governance tools.

7. Tollefsen (2009) argues that Wikipedia’s accountability mechanisms make it compatible with the assurance theory of testimony, according to which hearers are entitled to accept testimony because the speaker offers their assurance of the truth of the testimony and accepts responsibility for it. However, de Laat (2010) responds that accountability should not be confused with offering assurances, which anonymous editing precludes.

8. For a useful discussion of the design challenges of balancing openness to newcomers with vigilance against vandals, see Halfaker et al. (2014).

9. Of course, more women editors adding more articles on women will only add more truths to Wikipedia if women are as reliable as men, which there is no reason to doubt. Also, it is hard to obtain data about whether higher percentages of women Wikipedians will lead to more truths being added to the encyclopedia (rather than the removal of false claims or a change in presentation of existing claims). That said, the goal of many projects aimed at decreasing the diversity gap is to increase the number of women writing articles (“Wikipedia:WikiProject Countering systemic bias”).

10. Additionally, readers are misled about which topics (such as women’s marital status) are important. I thank an anonymous reviewer for this point.

11. See Fallis (2006) and Goldman (1999: 94–96) on the epistemic value of interesting truths.

References

“Active Wikipedians.” (n.d.). In Wikimedia. Retrieved December 16, 2015, from https://stats.wikimedia.org/EN/TablesWikipediansEditsGt5.htm.
“AfroCROWD:About.” (n.d.). In AfroCROWD. Retrieved December 14, 2015, from www.afrocrowd.org/?q=content/about.
Alcoff, L. (1991). “The Problem of Speaking for Others.” Cultural Critique, 20: 5–32.
Backer, A. (2015). “AfroCROWD—Bridging the Multicultural Gap.” WikiConference USA, October 10, 2015, Washington, DC. Retrieved December 14, 2015, from www.youtube.com/watch?v=WkHbg9V5wnI.
Coady, D. (2012). What to Believe Now: Applying epistemology to contemporary issues. Malden, MA: Wiley-Blackwell.
Code, L. (2006). Ecological Thinking: The politics of epistemic location. New York: Oxford University Press.
Cohen, N. (2011). “Define Gender Gap? Look up Wikipedia’s Contributor List.” The New York Times, January 30, 2011. Retrieved December 14, 2015, from www.nytimes.com/2011/01/31/business/media/31link.html?_r=0.
de Laat, P. B. (2010). “How Can Contributors to Open-Source Communities Be Trusted?” Ethics and Information Technology, 12(4): 327–341.
de Laat, P. B. (2015). “The Use of Software Tools and Autonomous Bots against Vandalism: Eroding Wikipedia’s moral order?” Ethics and Information Technology, 17(3): 175–188.
Encyclopedia Britannica, Inc. (2006). “Fatally Flawed: Refuting the recent study on encyclopedia accuracy by the journal Nature.” Retrieved December 13, 2015, from https://corporate.britannica.com/britannica_nature_response.pdf.
Erhart, E., and Barbara, J. (2015). “Hundreds of ‘Black Hat’ English Wikipedia Accounts Blocked following Investigation.” Wikimedia blog. Retrieved December 8, 2015, from http://blog.wikimedia.org/2015/08/31/wikipedia-accounts-blocked-paid-advocacy/.
Estlund, D. M. (1994). “Opinion Leaders, Independence, and Condorcet’s Jury Theorem.” Theory and Decision, 36(2): 131–162.
Fallis, D. (2006). “Epistemic Value Theory and Social Epistemology.” Episteme, 2(3): 177–188.
Fallis, D. (2008). “Toward an Epistemology of Wikipedia.” Journal of the American Society for Information Science and Technology, 59(10): 1662–1674.
Fallis, D. (2011). “Wikipistemology,” in A. I. Goldman and D. Whitcomb (eds.), Social Epistemology: Essential readings. New York: Oxford University Press.
Frost-Arnold, K. (2014a). “Trustworthiness and Truth: The epistemic pitfalls of Internet accountability.” Episteme, 11(1): 63–81.
Frost-Arnold, K. (2014b). “Imposters, Tricksters, and Trustworthiness as an Epistemic Virtue.” Hypatia, 29(2): 790–807.
Giles, J. (2005). “Internet Encyclopedias Go Head to Head.” Nature, 438: 900–901.
Goldman, A. (1992). Liaisons: Philosophy meets the cognitive and social sciences. Cambridge, MA: The MIT Press.
Goldman, A. (1999). Knowledge in a Social World. New York: Oxford University Press.
Goldman, A. (2011). “The Social Epistemology of Blogging,” in J. van den Hoven and J. Weckert (eds.), Information Technology and Moral Philosophy. New York: Cambridge University Press.
Graells-Garrido, E., Lalmas, M., and Menczer, F. (2015). “First Women, Second Sex: Gender bias in Wikipedia.” arXiv:1502.02341 [cs.SI].
Halfaker, A., Geiger, R. S., and Terveen, L. G. (2014). “Snuggle: Designing for efficient socialization and ideological critique.” Proceedings CHI 2014. ACM Press: 311–320.
Hartelius, E. J. (2011). The Rhetoric of Expertise. New York: Lexington Books.
Hayek, F. (1945). “The Use of Knowledge in Society.” American Economic Review, 35(4): 519–530.
Healy, K. (2007). “Wikipedia Follies.” Crooked Timber. Retrieved December 13, 2015, from http://crookedtimber.org/2007/02/04/wikipedia/.
Heilman, J. M., Kemmann, E., Bonert, M., Chatterjee, A., Ragar, B., Beards, G. M., … Laurent, M. R. (2011). “Wikipedia: A key tool for global public health promotion.” Journal of Medical Internet Research, 13(1): e14.
Hern, A. (2015). “Wikipedia’s View of the World is Written by the West.” The Guardian, September 15, 2015. Retrieved December 13, 2015, from www.theguardian.com/technology/2015/sep/15/wikipedia-view-of-the-world-is-still-written-by-the-west.
Hill, B. M., and Shaw, A. (2013). “The Wikipedia Gender Gap Revisited: Characterizing survey response bias with propensity score estimation.” PloS ONE, 8(6): e65782.
Intemann, K. (2010). “25 Years of Feminist Empiricism and Standpoint Theory: Where are we now?” Hypatia, 25(4): 778–796.
Kräenbring, J., Penza, T. M., Gutmann, J., Muehlich, S., Zolk, O., Wojnowski, L., … Sarikas, A. (2014). “Accuracy and Completeness of Drug Information in Wikipedia: A comparison with standard textbooks of pharmacology.” PloS ONE, 9(9): e106930.
Kupferberg, N., and Protus, B. (2011). “Accuracy and Completeness of Drug Information in Wikipedia: An assessment.” Journal of the Medical Library Association, 99(4): 310–313.
Lam, S. T. K., Uduwage, A., Dong, Z., Sen, S., Musicant, D. R., Terveen, L., and Riedl, J. (2011). “WP: Clubhouse? An exploration of Wikipedia’s gender imbalance.” Proceedings of the 7th International Symposium on Wikis and Open Collaboration. ACM Press: 1–10.
Lih, A. (2009). The Wikipedia Revolution: How a bunch of nobodies created the world’s greatest encyclopedia. London: Aurum Press.
Lih, A. (2015). “What Wikipedia Must Do.” WikiConference USA, October 9, 2015, Washington, DC. Retrieved December 14, 2015, from www.youtube.com/watch?v=Gj6U22uJzGM.
Lih, A., McGrady, R., Ramjohn, I., and Ross, S. (2015). “Thinking (and Contributing) Outside the Editing Box: Alternative ways to engage subject-matter experts.” WikiConference USA, October 10, 2015, Washington, DC.
“List of Wikipedias.” (n.d.). In Wikimedia Meta-Wiki. Retrieved December 16, 2015, from https://meta.wikimedia.org/wiki/List_of_Wikipedias.
Longino, H. (1990). Science as Social Knowledge. Princeton, NJ: Princeton University Press.
Magnus, P. D. (2009). “On Trusting Wikipedia.” Episteme, 6(1): 74–90.
McGeer, V. (2008). “Trust, Hope and Empowerment.” Australasian Journal of Philosophy, 86(2): 1–18.
Murphy, C. (2015). “Can ‘Black Wikipedia’ Take Off like ‘Black Twitter’?” Colorlines. Retrieved December 15, 2015, from www.colorlines.com/articles/can-black-wikipedia-take-black-twitter.
Nature. (2006). “Nature’s Responses to Encyclopedia Britannica.” Nature, 438: 900–901. Retrieved December 13, 2015, from www.nature.com/nature/britannica/.
Ortega, M. (2006). “Being Lovingly, Knowingly Ignorant: White feminism and women of color.” Hypatia, 21(3): 56–74.
Owens, S. (2013). “The Battle to Destroy Wikipedia’s Biggest Sockpuppet Army.” The Daily Dot. Retrieved December 8, 2015, from www.dailydot.com/lifestyle/wikipedia-sockpuppet-investigation-largest-network-history-wiki-pr/.
Pfister, D. (2011). “Networked Expertise in the Era of Many-to-Many Communication: On Wikipedia and invention.” Social Epistemology, 25(3): 217–231.
“Reliability of Wikipedia.” (n.d.). In Wikipedia. Retrieved December 4, 2015, from https://en.wikipedia.org/wiki/Reliability_of_Wikipedia.
Reynosa, P. (2015). “Why Don’t More Latinos Contribute to Wikipedia?” El Tecolote. Retrieved December 15, 2015, from http://eltecolote.org/content/en/commentary/why-dont-more-latinos-contribute-to-wikipedia/.
Sanger, L. M. (2009). “The Fate of Expertise after Wikipedia.” Episteme, 6(1): 52–73.
Seigenthaler, J. (2005). “A False Wikipedia ‘Biography’.” USA Today, November 29, 2005. Retrieved December 13, 2015, from http://usatoday30.usatoday.com/news/opinion/editorials/2005-11-29-wikipedia-edit_x.htm.
Silverman, M. (2012). “Encyclopedia Britannica vs. Wikipedia.” Mashable. Retrieved December 17, 2015, from http://mashable.com/2012/03/16/encyclopedia-britannica-wikipedia-infographic/#Olumu22uvkq6.
Simon, J. (2010). “The Entanglement of Trust and Knowledge on the Web.” Ethics and Information Technology, 12(4): 343–355.
Stephenson-Goodknight, R. (2015). “Women … It Takes a Village.” WikiConference USA, October 10, 2015, Washington, DC. Retrieved December 14, 2015, from www.youtube.com/watch?v=WkHbg9V5wnI.
Sunstein, C. (2006). Infotopia. New York: Oxford University Press.
Surowiecki, J. (2004). The Wisdom of Crowds. New York: Doubleday.
Temple, N. J. , and Fraser, J. (2014). “How Accurate Are Wikipedia Articles in Health, Nutrition, and Medicine?” Canadian Journal of Information and Library Science, 38(1): 37–52.
Thagard, P. (1997). “Collaborative Knowledge.” Noûs, 31(2): 242–261.
Tollefsen, D. P. (2009). “Wikipedia and the Epistemology of Testimony.” Episteme, 6(1): 8–24.
Wales, J. (2005). “Wales Interview Transcript.” Retrieved December 10, 2015, from https://en.wikipedia.org/wiki/User:One/Wales_interview_transcript.
Wales, J. (2009). “Foreword,” in The Wikipedia Revolution: How a bunch of nobodies created the world’s greatest encyclopedia. London: Aurum Press.
Wikimedia Foundation. (2011). Wikipedia Editors Study. Retrieved December 15, 2015, from https://wikimediafoundation.org/w/index.php?title=File%3AEditor_Survey_Report_-_April_2011.pdf&page=1.
“Wikipedia:About.” (n.d.). In Wikipedia. Retrieved December 16, 2015, from https://en.wikipedia.org/wiki/Wikipedia:About.
“Wikipedia:Assume good faith.” (n.d.). In Wikipedia. Retrieved December 10, 2015, from https://en.wikipedia.org/wiki/Wikipedia:Assume_good_faith.
“Wikipedia:Be bold.” (n.d.). In Wikipedia. Retrieved December 10, 2015, from https://en.wikipedia.org/wiki/Wikipedia:Be_bold.
“Wikipedia:Conflict of interest.” (n.d.). In Wikipedia. Retrieved December 8, 2015, from https://en.wikipedia.org/w/index.php?title=Wikipedia:Conflict_of_interest&oldid=694255381.
“Wikipedia:Core content policies.” (n.d.). In Wikipedia. Retrieved December 16, 2015, from https://en.wikipedia.org/wiki/Wikipedia:Core_content_policies.
“Wikipedia:Editing restrictions.” (n.d.). In Wikipedia. Retrieved December 10, 2015, from https://en.wikipedia.org/wiki/Wikipedia:Editing_restrictions.
“Wikipedia:Expert editors.” (n.d.). In Wikipedia. Retrieved December 4, 2015, from https://en.wikipedia.org/wiki/Wikipedia:Expert_editors.
“Wikipedia:List of citogenesis incidents.” (n.d.). In Wikipedia. Retrieved October 16, 2015, from https://en.m.wikipedia.org/wiki/Wikipedia:List_of_citogenesis_incidents.
“Wikipedia:Long-term abuse/Orangemoody.” (n.d.). In Wikipedia. Retrieved December 8, 2015, from https://en.wikipedia.org/w/index.php?title=Wikipedia:Long-term_abuse/Orangemoody&oldid=688451455.
“Wikipedia:Notability.” (n.d.). In Wikipedia. Retrieved December 2, 2015, from https://en.wikipedia.org/wiki/Wikipedia:Notability.
“Wikipedia:Please do not bite the newcomers.” (n.d.). In Wikipedia. Retrieved December 10, 2015, from https://en.wikipedia.org/wiki/Wikipedia:Please_do_not_bite_the_newcomers.
“Wikipedia:Statement on Wikipedia from participating communications firms.” (n.d.). In Wikipedia. Retrieved December 8, 2015, from https://en.wikipedia.org/wiki/Wikipedia:Statement_on_Wikipedia_from_participating_communications_firms.
“Wikipedia:WikiProject Countering systemic bias.” (n.d.). In Wikipedia. Retrieved October 16, 2015, from https://en.m.wikipedia.org/wiki/Wikipedia:WikiProject_Countering_systemic_bias.
“Wikipedia:WikiProject Women in Red.” (n.d.). In Wikipedia. Retrieved October 16, 2015, from https://en.m.wikipedia.org/wiki/Wikipedia:WikiProject_Women/Women_in_Red.
Wray, K. B. (2009). “The Epistemic Cultures of Science and Wikipedia: A comparison.” Episteme, 6(1): 38–51.