A collaborative space facing the professionals of “influence”

Building on a first testimony published in the newspaper Fakir, Mediapart revealed on June 27 how Avisa Partners, a French company specializing in economic intelligence, had spread fake press articles and fake blogs on behalf of international clients. Following the publication of this investigation and our post “Arming Professionals Against Misinformation,” several of our readers sent us messages of support or comments. We were also contacted again by Jules, a volunteer administrator of the collaborative encyclopedia Wikipedia, with whom we had exchanged a few months earlier. His testimony unexpectedly allowed a new thread of the investigation to be pulled, the second part of which Mediapart publishes today.

Within their community, built on a horizontal and decentralized organization, “Wikipedians” have only editorial powers: editing encyclopedia articles, inserting warning banners, and so on. They can also propose the deletion of a page whose content they consider problematic. The criteria for a page’s eligibility or immediate deletion (for example, in the case of offensive content) are freely accessible and specify the procedure to follow. Part of the commitment of Jules, one of the most involved members of the French-speaking community, consists of “patrolling” with other volunteers to ensure that those who edit pages respect Wikipedia’s founding principles and rules.

“The collaborative aspect of moderation allows for mutual control, in keeping with the platform’s self-management philosophy. There is no editor-in-chief, no committee that controls content,” he explained to us during our first meeting.

How Wikipedia hunts down agencies

For several years, Wikipedians have been tracking down “faux-nez” (sockpuppets), accounts associated with multiple identities used to spread opinions or rumors more effectively. They also target single-purpose accounts, created to modify targeted pages (in most cases related to companies or public figures) in order to polish one reputation or damage another. In 2018, the community even launched an “antipub” (anti-advertising) project to flush out promotional content, “which undermines the neutrality of viewpoints and the site’s encyclopedic mission.” On the page introducing the project, contributors warn: “The proliferation of communications agencies that do not respect Wikipedia’s founding principles makes better monitoring of user accounts and the associated articles necessary.”

When we met Jules, he had already begun to take an interest in Avisa’s covert activities. In his sights: the insertion of fake sources and the suspected direct intervention of the agency’s employees in encyclopedia articles. Faced with this common challenge, we decided to cross-reference our data. Wikipedia gave us a list of more than 2,300 links to the Mediapart Club cited in the encyclopedia. We found seven posts there (unpublished since June 2022) put online by five “Avisa” blogs, which we reported back to them in return. During their internal investigation, Wikipedia contributors discovered that the “antipub” project had been infiltrated by a contributor acting on behalf of the online-reputation agency. Two more accounts were blocked earlier this month for the same reason.

Beyond this one-off collaboration, a question drives us: how can participatory and collaborative spaces be protected from these practices that distort debate? What moderation policies should be adopted to better guard against the professionals of “influence”?

Community protections and their evolution

At Wikipedia, users started asking themselves this question several years ago. “In the beginning, everyone wrote without citing many sources. It wasn’t necessary,” Jules recalls. “This requirement appeared in the second half of the 2000s, in response to criticism, particularly from teachers and the academic world. We then gradually moved from the requirement of verifiability (the information had to be verifiable, even if the source was not cited in the article) to the requirement of citing sources recognized as reliable.” Today, to help contributors find their way, pages set out criteria for identifying reliable, quality sources, and even define primary and secondary sources. In full transparency, a Sources Observatory also lists ongoing discussions about the reliability of some of them. Any volunteer can suggest a correction to these pages.

Faced with the new demands of its readers, in 2005 the community created the “needs sources” (“à sourcer”) warning banner, which is now attached to hundreds of thousands of encyclopedia pages. Since then, a whole palette of warning messages has been developed. These banners make it possible, for example, to report suspected plagiarism or “promotional content.” Inserted at the top of articles by contributors (following a consensus), they serve to list pages by category, encourage contributors to revise them, and give readers a critical perspective on the content.

“In principle, these messages are temporary, but most remain in place for years, until the problem is resolved,” Jules notes. “Their value is that they bring great transparency to internal discussions and to the risks identified by contributors.”

The flaws in this protection system

As with Mediapart’s participation charter, this set of protections inevitably has limits. Back in 2012, a literature teacher made a splash by recounting how he had tricked his students, without their knowledge, by altering a Wikipedia page for educational purposes. More systematically, “edit wars” (disagreements between contributors over an article) sometimes give the volunteer moderation teams a hard time.

In “Beneath the Keyboards, Rage,” a comic-strip investigation published in late 2019, a comics magazine looked back at the edit war that turned the “Yellow Vests movement” page into a real battleground. At stake: the political definition and description of this unprecedented social movement. The article went on to break records for reads (64,000 in two weeks) and edits (2,360 changes published by 243 people in three weeks). In the Club, historian Nicolas Leberge recently described, drawing on an article in Numerama, the ongoing battle over Élisabeth Borne’s page, edited 550 times after her appointment to Matignon. When these battles become too heated, Wikipedians can call in volunteer mediators or, ultimately, an administrator who will protect the page.

The latest textbook case: on July 12, an investigation by Le Monde, based on the “Uber Files” revelations of the International Consortium of Investigative Journalists (ICIJ), showed how Wikipedia pages related to the start-up Uber had been edited by iStrat to tone down criticism. Beyond the encyclopedia, a number of media outlets were also used by iStrat on behalf of this client.

In response to these attacks, administrators may decide to block an account temporarily or, after several warnings, to block it indefinitely. On this point, our moderation strategies are similar: in the Club, in the event of repeated publication of posts contrary to the charter, and after contacting the blogger concerned, we suspend their right to participate temporarily or permanently (depending on the seriousness of the breach).

And, as we have explained here and here, the use of a false identity now leads us to treat the content of the associated blog as “fake news.” We also recall that, in the face of repeated manipulation via blog posts, Mediapart’s graphic charter was redesigned to clarify the reading contract and better distinguish the Club space from the newspaper.

Alliances to protect our freedom

Faced with this confrontation with the disinformation industry, what we share with Wikipedia is perhaps a rejection of an elitist vision of knowledge production. Our Charter prohibits the dissemination of false news, but our moderation principles are far from being modeled solely on existing law.

In addition to our daily attention to published content, we encourage our readers to use the “Report this content to our team” tab, available on every blog post. Working on the same system as comments, it is a way to alert those in charge of moderation to content contrary to our charter or to the participatory values we claim in our manifesto.

“Protecting freedom of expression implies our responsibility not to provide a pretext [to conservative offensives],” the latter underlines. “Just as Mediapart defends quality, rigorous and professional journalism, it intends to promote, in its Club and in its comment spaces, a demanding and benevolent debate against hateful invective and fake news.” In practice, we readily share the definition of moderation that La Quadrature du Net proposed in June 2019: “Its goal is not to remove content or people, but to reduce violence. (…) It recognizes our imperfections, our feelings, but also our ability to reason. (…) It allows us to work together.”

Without denying the risk of instrumentalization, the Club maintains its ambition to be a place of inclusive expression where diverse, sometimes discordant, perspectives can be voiced, and where stories and knowledge that strengthen us are shared. Because we believe in the public interest of these spaces, we have chosen to trust the collective intelligence of our community and its critical spirit. This is also why we regularly discuss the moderation of the Club and of the comments with you.

The news shows once again that, despite our vigilance and that of our readers, investigations will remain necessary to identify professional manipulators and then remove them from our participatory and collaborative spaces. This exchange of practices with Wikipedia reminds us that mutual aid is another of our weapons. And that such alliances can strengthen our freedom.
