From a broader perspective, low-quality content generates data-as-capital just as reliable information does, as long as it can attract users (Graham 2017). Under platform governance, the trustworthiness of the testifier (e.g. journalistic or scientific sources) is irrelevant to the capital accumulation cycle. Social media affordances reward content that triggers high user engagement through shares, comments, and reactions, regardless of its informative value. Such content often takes the form of compelling stories with dramatic appeal. It is for this reason that Nieborg and Poell (2018, p. 9) state, “publishers pursuing platform-oriented distribution strategies are subject to strong direct network effects, as platform sharing practices and algorithmic curation tend to favor viral content.” That is, even when news organizations enter the digital sphere controlled by platform companies, they are subjected to the same productive logic (Bell and Owen 2017).
In the multi-sided markets of platform management, journalism also acts as an intermediary of data circulation. These markets are characterized by the aggregation of end users and third parties (license holders, content creators, telecom operators, investors) by intermediating platforms (Schwarz 2017). Following Nieborg and Poell (2018), before the consolidation of digital platforms, large media publishers exerted control over the advertising industry because they had privileged access to a mass audience. As a prototypical two-sided market, the relationship between advertisers and the public was consistently mediated by the news industry (Nieborg and Poell 2018). Nowadays, the power publishers once held in relation to advertisers has been displaced by microtargeted advertising from Google and Facebook (Braun and Eklund 2019). In addition, advertisers that take part in automated auctions often do not know whether their ad will be placed with a trustworthy publisher or on a fake news page (Graham 2017). The main criterion leading advertisers to buy banners via Google AdSense, for instance, is targeted user behavior tracked by data mining. The credibility publishers have acquired no longer serves as a distinctive appeal to advertisers, since advertisers can reach potential consumers with specific demographic characteristics without relying on news media products (Nielsen and Ganter 2017). We could add to this picture Facebook’s systematic refusal to fact-check targeted ad campaigns run on its platform (even the most misleading ones), since the company claims the status of an independent intermediary rather than a media organization.
More recently (specifically in 2016), Facebook subtly accepted an editorial role, outsourcing its moderation activities by establishing partnerships with independent fact-checking organizations worldwide to check flagged content spreading on its platform. Suspicious claims identified as disinformation have their reach reduced and are linked to verification texts written by professional fact-checkers. Not coincidentally, in the same year, Google launched its own program to improve the ranking of fact-checking content in search results and Google News (Graves and Anderson 2020). Above all, however, these initiatives are an institutional response by the companies to their alleged improper influence on elections and referendums in Western democracies. With these programs, they can argue that they are making every feasible effort to fight fake news, sidestepping discussion of their business models by invoking the principle of journalistic “independence.” As a result, as Nicey and Bigot (2020) have argued, fact-checking organizations have gradually redirected their activities toward the very agenda of junk news that systematically surfaces on digital platforms.
In addition, several news organizations began employing the same techniques that fake news sites use to attract their audiences (e.g. “clickbait” stories, Search Engine Optimization techniques) as soon as they grasped the economic logic of platform companies. Low-quality information can thus spread fast enough to maintain the circulation of data and the subsequent extraction of value. In the news media, the scholarly literature highlights the downsizing of newsrooms as one outcome of this new business logic (Figaro et al. 2015). Reduced staff, in turn, limits the quality of the articles written by the remaining journalists. Moreover, the news industry is gradually submitting to social media affordances, unbundling articles to distribute them on as many platforms as possible (Bell and Owen 2017). As a result, journalists increasingly specialize in data analytics, developing strategies to split news stories into attractive formats that “adequately respond to [the] evolving interests of platform users” (Nieborg and Poell 2018, p. 13).
Until now, our argument has aligned with the aforementioned infrastructural approach to fake news. Nevertheless, we should also trace the first signs of the current surge of disinformation in earlier periods. From the forged “supplement” to the Boston Independent Chronicle written by Benjamin Franklin in 1782 to influence a treaty between the US and England (Pepp et al. 2019) to the famous “moon hoax” first published by the New York Sun in 1835 (Allcott and Gentzkow 2017), newspapers in the past also promoted misleading news stories for financial gain. In the nineteenth century, for instance, tabloid publications like the Illustrated Police News in the UK and the Kreuzzeitung in Germany were notorious for the entirely fictional stories published in their pages (Dentith 2017). As Pepp et al. (2019) also note, the main differences between these past examples of fake news and our current epistemic environment are threefold: “The newly emergent distributional infrastructure of social media, the correspondingly new speech act of social media sharing, and our shifting attitudes toward journalism as an institution” (Pepp et al. 2019, p. 86).
We could add to this picture that previous fake news depended on the press to gain traction. Misleading stories had to be printed by some organization in order to circulate in the public sphere. In addition, newspapers were inserted into the earlier model of capitalist accumulation as cultural commodities exchanged for profit. From a materialist standpoint, we could summarize this previous logic of capital circulation as follows: “the cycle of consumption is motivated by the use-value of a commodity, and it is completed when money is turned into a commodity” (Sadowski 2019, p. 3). Since the use-value of modern newspapers gradually became associated with their public reliability (as we mentioned previously), the continuous publication of fictional stories was not suitable for the news industry. Only tabloid-style newspapers (also known in the US as the “penny press”) succeeded in publishing low-quality articles at the expense of their credibility.
Since the present capital accumulation cycle is marked by data extraction, even cultural commodities are valued according to how readily they can be modularized for platform monetization. As publishers gradually become content developers (Nieborg and Poell 2018), cultural production is now dependent on data-driven processes. The very agenda of digital news platforms is often based on “trending social media topics and popular search terms” (Nieborg and Poell 2018, p. 2). However, even with reduced staff, news organizations carry infrastructural costs that fake news producers can simply bypass. Producers of misleading stories can quickly launch a site that mimics the legacy media format, fill it with programmatic advertising banners, and attract an audience through an initial investment in click farms and bots that boost traffic to their web pages.
Moreover, fake news may at times be more appealing than stories from legitimized testifiers, since its producers’ primary concern is fabricating content with the greatest viral impact. Consequently, disinformation frequently generates more data to keep the capital accumulation cycle in place. For this reason, the current surge of fake news can be seen as an unforeseen consequence of our prevalent value-creation logic, one that will probably not be constrained in the coming years.
Conclusion
In this chapter, we first developed a systematic review of the scholarly literature on fake news from 2017 to 2019 in order to advance our contribution to this ongoing discussion from a materialist viewpoint (based mainly on current appropriations of Marxist thought in platform studies). From our perspective, the growing body of research on fake news can be divided into descriptive and critical approaches, whose primary purposes vary according to their epistemological standpoints. The descriptive research, focusing on the characterization, infrastructure, and effects of fake news, aims to disentangle the term from its common-sense use. This line of study offers a crucial conceptualization of the notion and outlines some of its material roots (e.g. technological shifts, socio-political environments). Nevertheless, it lacks a macro-level account of the surge of fake news, one that clarifies how what we are experiencing now regarding disinformation is linked to broader historical changes.