As false or manipulated information continues to proliferate online during the Covid-19 pandemic, the Forum on Information and Democracy is publishing a report entitled How to end infodemics. Based on more than 100 contributions from international experts, it offers 250 recommendations on how to rein in a phenomenon that threatens democracies and human rights, including the right to health.
Launched in 2019 by 11 non-governmental organizations and research centres, the Forum on Information and Democracy created a working group on infodemics in June to devise a “regulatory framework” to respond to the information chaos on online platforms and social media. After five months of work, this group, whose steering committee is co-chaired by Maria Ressa and Marietje Schaake, is publishing a detailed report with 250 recommendations for governments and digital platforms.
The report, written by a team of rapporteurs led by Delphine Halgand-Mishra, identifies four structural challenges and proposes concrete solutions for each of them:
- platform transparency
- content moderation
- promotion of reliable news and information
- private messaging services
Many countries that are members of the Alliance for Multilateralism expressed their support when the Forum’s president, Christophe Deloire, gave a presentation about the working group to nearly 50 foreign ministers during an Alliance meeting on 26 June that was also attended by World Health Organization director-general Tedros Adhanom Ghebreyesus and UNESCO director-general Audrey Azoulay.
During another meeting of the Alliance to be held on 12 November as part of the Paris Peace Forum, Deloire will give a presentation on the Forum on Information and Democracy report and its main recommendations to Alliance foreign ministers.
“This report is proof that a structural solution is possible for ending the information chaos that poses a deadly danger to our democracies,” Christophe Deloire said. “All those adopting legislative initiatives with regard to platforms should be guided by this report, whether in India with Section 79, the United States with Section 230, Canada with the Digital Charter, the United Kingdom with the Online Harms Bill or, of course, the European Union with the Digital Services Act.”
“It’s been an honor to work with experts across many disciplines - exactly what is needed today,” says Maria Ressa, co-chair of the steering committee. “These times show more than ever that information is power, and when lies spread faster than facts, all human endeavor is threatened. It’s an existential moment for democracy and journalism. This is a concrete step forward to find systemic global solutions.”
“Democracy is under threat, and the lack of trust or outright manipulation increasingly has an information component,” explains Marietje Schaake, also co-chair of the steering committee. “Governance of our digital world must be wrested back from private companies and authoritarian states alike if democracy is to survive. Democratic leaders must assume their responsibility to preserve democracy, human rights and fundamental freedoms now.”
The twelve main recommendations of the working group
PUBLIC REGULATION IS NEEDED TO IMPOSE TRANSPARENCY REQUIREMENTS ON ONLINE SERVICE PROVIDERS.
- Transparency requirements should relate to all platforms’ core functions in the public information ecosystem: content moderation, content ranking, content targeting, and social influence building.
- Regulators in charge of enforcing transparency requirements should have strong democratic oversight and audit processes.
- Sanctions for non-compliance could include large fines, mandatory publicity in the form of banners, liability of the CEO, and administrative sanctions such as closing access to a country’s market.
A NEW MODEL OF META-REGULATION WITH REGARD TO CONTENT MODERATION IS REQUIRED.
- Platforms should follow a set of Human Rights Principles for Content Moderation based on international human rights law: legality, necessity and proportionality, legitimacy, equality and non-discrimination.
- Platforms should assume the same kinds of pluralism obligations that broadcasters have in the different jurisdictions where they operate. An example would be a voluntary fairness doctrine.
- Platforms should expand the number of moderators and spend a minimum percentage of their income on improving the quality of content review, particularly in at-risk countries.
NEW APPROACHES TO THE DESIGN OF PLATFORMS HAVE TO BE INITIATED.
- Safety and quality standards of digital architecture and software engineering should be enforced by a Digital Standards Enforcement Agency. The Forum on Information and Democracy could launch a feasibility study on how such an agency would operate.
- Conflicts of interest of platforms should be prohibited, in order to prevent the information and communication space from being governed or influenced by commercial, political or any other interests.
- A co-regulatory framework for the promotion of public-interest journalistic content should be defined, based on self-regulatory standards such as the Journalism Trust Initiative; friction to slow down the spread of potentially harmful viral content should be added.
SAFEGUARDS SHOULD BE ESTABLISHED FOR CLOSED MESSAGING SERVICES WHEN THEY BEGIN TO FUNCTION AS PUBLIC SPACES.
- Measures that limit the virality of misleading content should be implemented by restricting certain functionalities: opt-in features for receiving group messages, and measures to combat bulk messaging and automated behavior.
- Online service providers should be required to better inform users regarding the origin of the messages they receive, especially by labelling those which have been forwarded.
- Mechanisms for users to report illegal content, and appeal mechanisms for users banned from services, should be reinforced.
Reporters Without Borders (RSF)