Indonesia’s new UGC platform regulation evaluated against international standards

Jul 26, 2020 | Free Speech, Open Blog

 
I was invited to speak at “Digital Week 2020: Shaping the Digital Economy Through Public-Private Dialogue” to shed light on best practices in intermediary liability and safe harbour protection for Indonesia’s digital economy, alongside the country’s top official on the subject, on Wednesday, July 22, 2020 at 10:00 (GMT+7), as follows:
 
Webinar #2: User-Generated Content and Platform Governance in the Digital Society
Moderator: Rainer Heufers, Executive Director, Center for Indonesian Policy Studies
– Damar Juniarto (Executive Director, SafeNET)
– Jean-Jacques Sahel (Government Affairs & Public Policy, Google Asia Pacific)
– Prof Kyung-Sin Park (Law Professor, Korea University Law School)
– Semuel Abrijani Pangerapan (Director General of Informatics Applications, Ministry of Communications and Informatics)
 
This has special meaning for me because I spoke at an event about two years ago, when Indonesia’s intermediary liability rule was in its infancy in the form of a Circular.  Two years later, the government has upgraded it into a full regulation and extended its coverage to all user-generated content platforms, not just those mediating the sale of goods as the former Circular did.  Based on the publicly available draft, the regulation has the following elements:
  • Notice to UGC creator if there’s a removal request against their content (Article 15.3.c)
  • No notice to platforms before administrative sanction and blocking of access due to pornographic and gambling content (Article 16.1)
  • A turnaround time of 1×24 hours for removal of pornographic/gambling content (Article 16.3); otherwise, imposition of fines and/or blocking of access to the platform at the ISP level
  • Private-scope ESPs are required to prevent access to illegal electronic information/documents (EI/ED) or content troubling the community and disturbing public order (Article 13.2)
  • An urgent request to remove content within 2 hours applies to terrorism-related content and content that causes public unrest and disorder (Article 20.2)
  • If a platform does not comply with a removal notice within the specified deadline of 48 hours, the Minister may oblige ISPs to block access to the service (Article 21.7) and shall impose a fine (Article 21.10) that accumulates every 24 hours until compliance, up to a maximum of three impositions (Article 21.11)
  • ESPs are obliged to give the Ministry (‘in the event of supervision’) or law enforcement (‘in the event of law enforcement, according to the laws and regulations’) access to their electronic systems and electronic documents (Article 29)

The Internet’s value to people relies on the intermediary liability safe harbor, an international human rights rule that intermediaries shall not be held liable for content they do not know about. Without it, intermediaries would engage in prior censorship or “general monitoring”, leaving up only the content they explicitly or implicitly approve and thereby turning the online space into a gated community controlled by gatekeepers.

In this setting, it may look innocuous to hold intermediaries liable for content they DO know the existence of, or are notified of by private parties or government bodies. However, unless such liability is limited to content whose illegality, not just existence, they know of, it will create many false positives, because intermediaries will err on the side of deletion regardless of the content’s ultimate legality.  The international standard is a liability-exempting rule that affirmatively exempts intermediaries for unknown content, and its purpose of protecting the online space would be frustrated by a liability-imposing rule that, on its face, covers content intermediaries merely know the existence of.  If such a liability-imposing rule must exist at all, it should be limited to content whose existence AND illegality the intermediaries both know.  Having no such liability-imposing rule at all is also fine, since intermediaries will still be held liable, under general joint liability for torts or general accomplice liability for crimes, for content whose existence and illegality they are both aware of.

The new regulation still suffers from the same defect as the 2016 Circular: it imposes liability for not removing illicit content within a short period of time. Platform operators will be forced into taking much lawful content down.  This will hurt the country’s digital economy, because the liability rule is difficult to enforce against foreign platforms and will therefore restrict only domestic (Indonesian) platforms in competition, driving users to migrate from domestic platforms to foreign ones.  This is exactly what happened in Korea over the past 10 years.  The only way to enforce the rule against foreign platforms is to block their URLs, but such blocking is almost always excessive, given that the vast majority of the content on such platforms is almost always lawful.

Going forward, pay attention to the French Constitutional Council’s June 2020 decision on the Avia Law, which is exactly on point in evaluating the Indonesian regulation.

  Article 1, paragraph I of the Avia Law would have amended an existing piece of legislation, Law №2004–575 of 21 June 2004 on confidence in the digital economy (loi n° 2004–575 du 21 juin 2004 pour la confiance dans l’économie numérique). Under the existing law, the French administrative authorities have the power to direct any online service provider to remove specified pieces of content which they consider to constitute (1) content which glorifies or encourages acts of terrorism, or (2) child sexual abuse imagery, supported by criminal sanctions. Where the online service fails to do so within 24 hours, or if the authority is not able to identify and notify the responsible online service, the authority can request ISPs and search engines to block access to the web addresses containing the content in question.

Article 1, paragraph I of the Avia Law would reduce that time period from 24 hours to 1 hour and increase the potential sanction for failure to comply to one year’s imprisonment and a fine of 250,000 euros (up from the current one year’s imprisonment and a fine of 75,000 euros).

The same law, in paragraph II, would also have amended Law №2004–575 of 21 June 2004 on confidence in the digital economy by introducing a brand-new regime to complement the existing one. Under this new regime, online service providers whose activity exceeded a particular size (to be set down in a decree) would have a new legal obligation to remove or make inaccessible certain forms of “manifestly illegal” content within 24 hours of being notified by one or more persons, not just administrative bodies. These forms of illegal content are broader than paragraph I’s terrorism glorification or child sexual abuse material, and include content (1) condoning crimes, (2) encouraging discrimination, (3) denying or trivializing crimes against humanity, slavery, war crimes, etc., (5) insulting persons on account of their sex, etc., or sexually harassing others, (7) pornography representing children, (8) encouraging terrorism, and (9) disseminating pornography to minors.

In striking down the 1-hour administrative censorship portion of the Avia Law (covering terrorism and child pornography), the Constitutional Council focused on, among other things, the fact that the operative legality of the content was determined solely by administrative authorities and that, given the short time limit, online service providers were unable to obtain a decision from a judge before having to remove content.

As to the 24-hour private notice-and-takedown component, the Constitutional Council noted that “the decision as to whether the content is ‘manifestly illegal’ or not is not one that a judge makes, but solely down to the online service provider [when] the decision may be a difficult one involving very technical legal issues, . . . particularly on crimes related to the press.”

Overall, the Constitutional Council’s decision is primarily concerned with “false positives”, i.e., takedowns of lawful content, whether administratively or privately triggered.

The reasoning of the Constitutional Council is important, since a number of governments around the world are looking at introducing legislation which would impose obligations on social media platforms, search engines and other online service providers to take steps to limit the availability of different forms of illegal (and, in some cases, legal but harmful) content. The Indonesian regulation contains obligations similar to those in the Avia Law, particularly obligations to remove content notified by an administrative authority or private persons, and to remove certain forms of content within a specified period of time (whether notified of their existence or not), with sanctions for failing to do so.

Also, the Indonesian regulation’s no-notice takedown duty for pornographic and gambling content directly violates the international standard of the intermediary liability safe harbor, in that it holds platform operators liable for the existence of content regardless of their knowledge of it.  This is equivalent to the “general monitoring” obligations warned against in the EU e-Commerce Directive.
