Summary of K.S. Park’s remarks:
1. Algorithms have been criticized as facilitators of disinformation. However, the filter bubble and echo chamber effects that algorithms create are also used by do-gooders and activists to spread their ideas. It is not this amplifying effect that we should focus on, as long as users are given a choice whether to turn the recommendation feature on or off. We should respond to technology with technology: algorithms can be deployed to flag disinformation, and we humans can control which parameters are used to decide which postings constitute disinformation.
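To make the last point concrete, here is a minimal, purely illustrative Python sketch. The policy object, field names, and thresholds are all assumptions for illustration, not anything described in the remarks; the point is only that the decision criteria sit in a plain, human-editable policy rather than inside an opaque model.

```python
from dataclasses import dataclass

@dataclass
class FlaggingPolicy:
    """Human-set parameters that decide which postings get flagged.

    All thresholds here are illustrative placeholders; in practice they
    would be set and revised through an accountable editorial process.
    """
    min_debunk_matches: int = 2    # independent fact-checks contradicting the claim
    max_source_trust: float = 0.3  # flag only if the source's trust score is at or below this
    min_virality: int = 1000       # shares per hour before a post is even reviewed

def should_flag(post: dict, policy: FlaggingPolicy) -> bool:
    """Return True if the post should be surfaced for human review.

    `post` is a hypothetical record with 'shares_per_hour',
    'debunk_matches', and 'source_trust' fields.
    """
    return (
        post["shares_per_hour"] >= policy.min_virality
        and post["debunk_matches"] >= policy.min_debunk_matches
        and post["source_trust"] <= policy.max_source_trust
    )

# Tightening or loosening the criteria is an explicit human decision.
lenient = FlaggingPolicy(min_debunk_matches=3, max_source_trust=0.1)
strict = FlaggingPolicy(min_debunk_matches=1, max_source_trust=0.5)

post = {"shares_per_hour": 5000, "debunk_matches": 2, "source_trust": 0.2}
print(should_flag(post, lenient))  # False: requires 3 debunks and higher distrust
print(should_flag(post, strict))   # True
```

The design choice this sketch illustrates is that adjusting what counts as flaggable is a reviewable human act, which is what keeps the flagging algorithm accountable rather than opaque.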
2. Disinformation strategy must comply with international human rights standards concerning “fake news” crimes, which means information should not be suppressed or punished for falsity but for its likelihood of causing harm. Framed that way, disinformation strategy must be informed by hate speech discourse, because much of the harm caused by disinformation arises from false information spread by ruling groups or the ruling government against the socially vulnerable. In particular, state-sponsored disinformation looms much larger in terms of harm than privately generated disinformation.
3. Germany’s NetzDG is being copied in Southeast Asia, with much worse applications. The idea that disinformation can somehow be combatted by holding platforms liable for not taking down notified content quickly has generated a series of “anti-fake news” laws in Southeast Asia constituting a strict liability regime, where platforms are held liable for failure to take down unlawful content even if they could not have known that the content was unlawful. Under this scheme, platforms err on the side of taking down all content notified by government or private complainants, even lawful content, because the penalty for wrongly retaining content is simply too severe.
On the surface, Germany’s NetzDG does not violate international human rights standards on intermediary liability, since liability attaches only to notified content. However, there is a grey area as to whether platforms must have knowledge only of the existence of unlawful content, or of both its existence and its illegality. If mere existence suffices, i.e., strict liability applies to a failure to take down notified content that turns out to be unlawful, then platforms have a strong incentive to take down all notified content, causing overbroad censorship of much lawful content.
CDA Section 230 has been criticized for tolerating platform operators who knowingly refuse to take down harmful content. Those cases can be captured by general tort liability, not by imposing NetzDG-type regulation that sets up a strict liability scheme.
* Event information
The purpose of the workshop is to provide advice to the Special Rapporteur on the follow up of her recent report to the UN Human Rights Council and to make suggestions for her next report. In the first part of the meeting the participants will be invited to discuss the report on disinformation and make suggestions on how best to take forward the recommendations. The second part of the meeting will be a brainstorming session on media freedom, with a view to helping the Special Rapporteur to identify issues that could form the focus of her next report to the Human Rights Council. The meeting will be held under Chatham House Rule (no public attribution of speakers).
The workshop is jointly organized by the Office of the High Commissioner for Human Rights and the Albert Hirschman Centre for Democracy at the Graduate Institute.
BACKGROUND MATERIALS:
Presentation of disinformation report to the Human Rights Council (interactive dialogue)
UNSR Report on gender justice A/76/258 (optional reading for additional context)
PROGRAMME:
3:00-3:10 Setting out the objectives of the meeting
Opening remarks by Irene Khan
3:15-4:15 Comments and feedback on the Special Rapporteur’s report on disinformation and freedom of expression
- Which conclusions and/or recommendations do you consider to have the highest potential impact? Which conclusions/recommendations would benefit from further refining?
- How would you recommend that the Special Rapporteur take this work forward during her mandate?
- What further recommendations, measures and initiatives could be developed by the Special Rapporteur on disinformation? What partnership possibilities should she consider to advance her work in this area?
4:15-4:30 Break
4:30-5:45 Input on forthcoming report on media freedom in the digital age
- What are the most important or pressing threats and challenges to media freedom in the digital age?
- Which issues in relation to the legacy media would benefit from the attention of the Special Rapporteur? Some possible issues for discussion are:
○ Role of legacy media in contributing to or countering misinformation and disinformation
○ Digital threats to media freedom, e.g. electronic surveillance of journalists and their sources
○ Relevance of media freedom to sustainable development
○ Threats to investigative journalism
○ Media’s role in advancing gender equality
5:45-6:00 Wrap up and conclusions
Closing remarks by Irene Khan