2 Peer‐to‐Peer Infrastructure
In the early years of the new millennium, the word “peer” became widely known because of the conjunction of two distinct understandings, one scientific, the other popular. On the scientific side, legal scholar Yochai Benkler (2002) proposed in his journal article “Coase’s Penguin, or Linux and the Nature of the Firm” a seminal understanding of free and open source software (FOSS) as a form of “commons‐based peer production” whose productive efficiency, based on the ease and speed of incorporating multiple contributions to an object, surpassed that of firms and markets. Meanwhile in the Global North more generally, the notion of “peer‐to‐peer” generated wide public interest. This derived from the popularity of practices enabled by the non‐centrally controlled, or distributed, structure of the early mass Internet, prior to its subsequent enclosure by proprietary social media platforms (see Birkinbine, this volume; Kostakis & Bauwens, this volume). Such practices, whose archetype was the Napster file‐sharing service, included torrenting, or exchanging files online for free. What was truly original about Napster was that the files available for download were not located in a central computer: they were stored on users’ machines, and made available to others through Napster’s (centrally hosted) software. Each node, wherever it was located in the world, was accessible and contributed to the peer‐to‐peer system.
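To make this architecture concrete, the sketch below separates the two roles the paragraph describes: a centrally hosted index that stores no files, only who has them, and peer nodes that hold the files and serve them directly to one another. It is purely illustrative, written in Python under assumed names (CentralIndex, PeerNode, register, lookup, announce, serve); it does not reproduce Napster’s actual protocol.

```python
# Illustrative sketch of a "central index, distributed storage" model in the
# spirit of Napster. All class and method names are invented for this example.

class CentralIndex:
    """The centrally hosted service: it stores no files, only who has them."""
    def __init__(self):
        self.catalog = {}  # filename -> list of peer addresses

    def register(self, filename, peer_address):
        self.catalog.setdefault(filename, []).append(peer_address)

    def lookup(self, filename):
        return self.catalog.get(filename, [])


class PeerNode:
    """A user's machine: it holds files and serves them directly to other peers."""
    def __init__(self, address, files):
        self.address = address
        self.files = dict(files)  # filename -> content

    def announce(self, index):
        # Tell the central index which files this node is willing to share.
        for filename in self.files:
            index.register(filename, self.address)

    def serve(self, filename):
        return self.files.get(filename)


# The index brokers the search, but the transfer happens between peers.
index = CentralIndex()
alice = PeerNode("alice.example:6699", {"song.mp3": b"...bytes..."})
bob = PeerNode("bob.example:6699", {})
alice.announce(index)

sources = index.lookup("song.mp3")                    # ask the index who has the file
peers_by_address = {alice.address: alice, bob.address: bob}  # stand-in for the network
content = peers_by_address[sources[0]].serve("song.mp3")     # fetch directly from the peer
```

The design point the toy example makes is the same as the paragraph’s: removing the central index breaks search, but the files themselves never pass through a central computer.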
This collaborative production and exchange of content, knowledge, and systems involved participants with varying degrees of ownership and control of the (software/hardware) means of production. A system like Napster relied on participants to function, and they in turn could use the service for free, but Napster soon became a for‐profit company (Alderman, 2001). Now, in the twenty‐first century’s third decade, we face a somewhat different situation. Peer‐to‐peer practices such as torrenting have been almost completely criminalized out of existence, but the Napster model of using and contributing to an online service for free became widespread in the mid‐2000s, with Facebook an emblematic example. In terms of architecture, for many people the Internet is now a content delivery model on closed platforms such as social media or entertainment streaming networks, not a system allowing users to perform effective peer‐to‐peer networking. To be sure, peer production emerged in the 1990s and 2000s despite the network’s physical infrastructure – the fiber‐optic submarine links, terrestrial cables, data centers, cloud storage and Internet of things – being privately owned. A similar paradox concerns the principle of net neutrality, the idea that Internet Service Providers (ISPs) and governments should treat all data equally – instead of charging users differentially or limiting access to certain platforms or applications. It is less of a surprise that net neutrality has lasted this long if we understand it as an example of the neoliberal principle of free and undistorted competition (Cohen, 2019).
It now becomes necessary to distinguish an expansive definition of infrastructure as pervasive digital arrangements, from a narrow one that focuses on physical and material settings only. In restrictive terms, when it comes to peer‐to‐peer physical infrastructure, or “built networks that facilitate the flow of goods, people, or ideas and allow for their exchange over space” (Larkin, 2013), the potential for non‐corporate users to autonomously own and control a global network has been neutralized. In contrast, when it comes to a more expansive definition, the situation is reversed. It should be noted that “infrastructure” is not solely limited to material components: “beyond bricks, mortar, pipes or wires, infrastructure also encompasses more abstract entities, such as protocols (human and computer), standards, and memory” (Bowker et al., 2010, p. 97).
Peer‐produced digital infrastructure, that is to say free and open source software, is ubiquitous online. Let us consider the foundational open source web stack known by the acronym LAMP (Linux, Apache, MySQL, Perl/PHP/Python): Google owes its dominance to Linux (used in Android and Chrome OS); Apache powers 40% of the Internet’s web servers; without the MySQL database, there would be no online commerce (PayPal, Amazon), social media (Facebook, Twitter, LinkedIn), or “sharing economy” (Uber, Yelp); Perl, PHP, and Python are also highly popular programming languages. As for Wikipedia, it is no longer an unreliable joke, but a legitimate source of correctness in the age of networked disinformation – in fact, it is among the most popular websites in the world (van Dijck, 2013). The narrow definition of peer‐produced infrastructure – such as torrenting – was effectively banned; the expansive one – such as updating the Linux kernel – was permitted, and put to work for the global communication network.
What might still be possible in the future, despite the platformization of the Internet and arbitrary regulatory mechanisms? Furthermore, what kind of resilient infrastructures will foster people’s ability to participate in peer production without over‐consuming natural resources and contributing to the destruction of the biosphere?
3 The Exclusive Attraction of Commons‐Based Peer Production
We consider first what peer producers are building right now. Many continue to focus on autonomous infrastructure in order to oppose the technological giants and offer civil society an alternative. Examples include distributed physical infrastructure at the local level, in the form of mesh or wireless community networks (such as Guifi in Catalonia, Freifunk in Berlin, and many others; see Shaffer, this volume); peer‐to‐peer encrypted messaging and forums (such as Briar; see also Velasco Gonzáles & Tkacz, this volume); hackerspaces, hacklabs, and biohacklabs (see Boeva & Troxler, this volume; Meyer, this volume); community telecommunications infrastructure such as Rhizomatica in Mexico, Colombia, and Brazil (see interview with Bloom, this volume; Shaffer, this volume); free digital libraries where copyrighted material can be found and uploaded, such as Memory of the World, Library Genesis, and Monoskop; and tech collectives that peer produce services geared towards activists, such as VPNs, file sharing, server space, and many others. At the hardware level, peer production projects have developed open source machines, tools, and infrastructure which fight against pollution, such as Precious Plastic, which makes available blueprints showing how to build plastic‐recycling machines (see also Braybrooke & Smith, this volume).
Many others have embraced the expansive definition of infrastructure by contributing to Wikipedia (see Haider & Sundin, this volume), to Project Gutenberg (a volunteer‐run text digitization project), or by uploading code commits to GitHub (a code sharing and publishing service and social networking site for programmers, whose “social coding” has proved wildly popular with the FOSS community). Why has distributed collaboration between volunteers proved so successful? The promise that peer production is always radically decentralized, collaborative, and nonproprietary (Benkler, 2006, p. 60) has not eventuated: the most technologically advanced forms of peer production have hybridized with the market, as detailed in our next section. But Benkler did not just define peer production’s infrastructural characteristics: he also addressed the moral benefits of sharing resources and of self‐determination. Peer production, as a way of working collaboratively with peers, can only thrive if people treat each other with respect and dignity. The cumulative impact of non‐exploitative micro‐actions is profound;