Platform accountability in South Asia & SEA – APrIGF 2024

Sep 3, 2024 | Free Speech, Open Blog, Open Seminar

On August 21, 2024, K.S. Park, Director of Open Net, spoke at the above-titled session at APrIGF 2024.

Accountability of social media and user-to-user communication platforms (“platforms”) for their speech and privacy practices is a longstanding, albeit contested, issue. Countries have adopted different approaches to holding platforms accountable, from legislation (Singapore) to coalition-led voluntary commitments (Australia). India, by contrast, has taken platforms to court, demanding more transparency and better protection of its communities against hate speech. Efficacy measurements around these approaches are fragmented, so it is difficult to identify what works and what doesn’t, and, subsequently, how to make more concerted efforts based on those learnings. This panel discussion brings together experts from Bangladesh, Nepal, the Philippines and South Korea to examine the varied approaches to accountability and to making online spaces safer, discuss what works, and offer recommendations on specific interventions that civil society and policymakers can adopt based on shared, cross-regional experiences.

The panel discussion builds on a series of civil society-led conversations in the Asia Pacific and the Global Majority that have increasingly recognized that accountability efforts are fragmented. As a result, the uneven power dynamic among private entities, policymakers and non-governmental voices is exacerbated, and communities are left unsafe and unheard. There is a growing urgency to de-duplicate efforts and share learnings, ideally contributing to coalition-building in APAC and globally. The panel is thus both a culmination of these discussions, bringing them to a wider audience, and a starting point for more cross-regional learning and coalition-building.

Moderator:

  • Liza Garcia, Executive Director, Foundation for Media Alternatives

Speakers:

  • Sabhanaz Rashid Diya, Founder and Executive Director, Tech Global Institute
  • KS Park, Founder, Open Net Korea
  • Prateek Waghre, Executive Director, Internet Freedom Foundation (Invited)
  • Grace Salonga, Executive Director, Movement against Disinformation

K.S. Park spoke about the need to study the DSA as a viable framework for platform accountability. The DSA does not impose liability for specific content, removing the risk of its turning into a censorship regime. NetzDG, by contrast, imposes liability for failure to take down specific content. That liability scheme was hijacked by authoritarian governments in Southeast Asia and turned into extreme censorship measures, for instance through deadlines to take down content within one day, four hours, and sometimes within one hour. Under such stringent time constraints, platforms have no time to determine carefully whether content is legal. So if a platform decides not to take down content because it is unsure of its illegality and is then held liable, it is in effect being held liable for content whose illegality it was not aware of. Imposing liability for content a platform is unaware of is exactly the evil that the E-Commerce Directive 2000, CDA 230, DMCA 512, Japan’s Provider Liability Limitation Act, and India’s intermediary liability rules all labored to avoid, lest platforms engage in prior censorship, general monitoring, or simply shutting down the space.

The Preamble of the DSA clarifies that the DSA is not about imposing liability on platforms for content but about how platforms can exempt themselves from liability: “The rules on liability of providers of intermediary services set out in this Regulation should only establish when the provider of intermediary services concerned cannot be held liable in relation to illegal content provided by the recipients of the service. Those rules should not be understood to provide a positive basis for establishing when a provider can be held liable, which is for the applicable rules of Union or national law to determine.”

The above clarification ensures that platforms do not become vending machines where anyone can bring down any undesirable posting simply by sending a notification, exploiting platforms’ tendency to err on the side of removing rather than retaining postings. The DSA does not create “notice liability” of the kind NetzDG created.

The DSA is not without its weaknesses. It privileges governments as notice-givers because it allows administrative bodies (as well as, of course, judicial bodies) to issue takedown orders. Contrary to the clarification above, enforcement of such takedown orders will inevitably impose liability on platforms for failure to take down specific content. Although there is no provision to that effect, the provision authorizing the issuance of injunctions may well be read to authorize the imposition of liability. The DSA also empowers governments to select “trusted flaggers” whose notices must be processed as a matter of “priority”. There may be very controversial political fights over who can become a “trusted flagger” and thereby acquire clout in influencing the content ecosystem.

The DSA thus presents an interesting opportunity for Asia both to learn from and to help establish an international standard for intermediary liability safe harbors while instituting platform accountability.
