Platform decision-making concerning user-generated content should include the following platform review and judicial review processes.
Platforms must comply with laws and with court and agency orders that require the take-down of illegal content, such as copyright-infringing content. Additionally, to the extent permitted by law, platforms may moderate user-generated content of their own accord or on the basis of take-down requests and/or notifications. In those circumstances, platforms should provide for prompt, transparent and efficient review of complaints that users submit against take-downs of their content, and should implement effective remedies, including content restoration (put-back).
Platforms should establish a clear, simple and easy-to-understand procedure for notifying users about take-down requests and/or notifications that the platforms receive. Platforms should prescribe the content of a proper notification and counter-notice and establish procedures for handling unjustified take-down complaints. The procedural matters that platforms prescribe should include a) notification and counter-notice deadlines, b) timely follow-up actions by platforms, c) an opportunity for a complainant to submit evidence, in the format specified by the platform and within a reasonable time frame, and d) appropriate and quick appeals procedures that give the parties an opportunity to be heard and result in a timely review decision. Platforms should not make it unduly difficult to provide evidence, nor should a platform establish stricter requirements than those envisaged by law.
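Purely by way of illustration, the procedural elements above could be captured in a simple data model. The sketch below is a minimal, hypothetical Python example; the class and field names and the deadline values are assumptions made for exposition, not requirements of these principles.

```python
# Hypothetical sketch of the procedural elements a platform might prescribe.
# All names and deadline values are illustrative assumptions, not requirements.
from dataclasses import dataclass, field
from datetime import datetime, timedelta

COUNTER_NOTICE_WINDOW = timedelta(days=14)  # assumed counter-notice deadline (item a)

@dataclass
class TakedownNotice:
    complainant: str
    content_id: str
    legal_basis: str       # the law or platform standard allegedly violated
    description: str       # what the complainant must substantiate
    received_at: datetime

    def counter_notice_due(self) -> datetime:
        """Deadline by which the affected user may file a counter-notice."""
        return self.received_at + COUNTER_NOTICE_WINDOW

@dataclass
class CounterNotice:
    user: str
    notice: TakedownNotice
    grounds: str                                 # e.g. fair use, freedom of expression
    evidence: list[str] = field(default_factory=list)  # submitted within a reasonable time frame (item c)

    def is_timely(self, filed_at: datetime) -> bool:
        """Whether the counter-notice met the prescribed deadline (item a)."""
        return filed_at <= self.notice.counter_notice_due()
```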
Platform decision-making concerning the removal of user-generated content may include or provide for the following platform review and judicial review processes.
1. Even when they are not required to do so by law, platforms should, where appropriate, establish a notice-and-take-down procedure which allows for the notification and timely removal of content that violates the law or a platform’s standards. Platforms should remove or disable access to such content, as soon as is reasonably possible, upon receipt and review of the notification. Platforms should provide for prompt, transparent and efficient review of take-down requests, and effective remedies, including, in appropriate circumstances, content labelling as an alternative to take-downs (a simplified sketch of such a procedure follows this list).
2. Platforms should notify users whose content is removed of the specific reasons for the removal. Platforms should also inform users, in a simple and clear manner, of procedures, such as counter-notices, which they may use to appeal the removal of their content. Such procedures should be easy to understand, user-friendly and outlined in simple terms.
3. Platforms’ and complainants’ decision-making concerning user-generated content shall not, in principle, prevent users from making available content which is lawful, including where such content is covered by freedom of expression, fair use, other exceptions or limitations to rights of others, and where such content does not violate a platform’s standards.
4. In order to avoid over-blocking and preserve human rights and fundamental freedoms, such as the freedom of expression and the freedom to conduct a business, governments should create a legal environment in which platforms are not liable for keeping content, other than (serious/flagrant/gross) illegal or infringing content, available until the legality assessment is completed.
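To make the notice-and-take-down flow in item 1 concrete, the following is one possible, highly simplified state machine. The states, transitions and function names are illustrative assumptions only; actual platform designs will differ.

```python
# Illustrative state machine for a notice-and-take-down lifecycle.
# States and transitions are assumptions for exposition, not a mandated design.
from enum import Enum, auto

class ContentState(Enum):
    AVAILABLE = auto()
    UNDER_REVIEW = auto()   # notification received, assessment pending
    LABELLED = auto()       # remedy short of removal (see item 1)
    REMOVED = auto()
    RESTORED = auto()       # put-back after a successful counter-notice

def on_notification(state: ContentState) -> ContentState:
    """A notification moves available content into review rather than
    triggering immediate removal (cf. item 4 on over-blocking)."""
    return ContentState.UNDER_REVIEW if state == ContentState.AVAILABLE else state

def on_review_outcome(violates: bool, label_sufficient: bool) -> ContentState:
    """After review, keep, label, or remove the content."""
    if not violates:
        return ContentState.AVAILABLE
    return ContentState.LABELLED if label_sufficient else ContentState.REMOVED

def on_successful_counter_notice(state: ContentState) -> ContentState:
    """An upheld counter-notice results in restoration (put-back, item 2)."""
    return ContentState.RESTORED if state == ContentState.REMOVED else state
```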
4.2 Appeals Panel
Where a user or complainant files an appeal against a platform’s decision to remove content or to keep it available, the decision should be reviewed in a timely manner by a competent, independent and impartial decision-maker, such as an appeals panel. On appeal, the appeals panel should assess the compliance of the decision with the platform’s standards. Appeals panel members should have adequate legal and professional training, accreditation and independent standing.
The law may require review by a competent alternative dispute resolution panel, administrative authority or court, either in all cases or where the party or parties are not satisfied with the decision of the appeals panel. Even when the law does not so require, platforms should, in cases where the party or parties are not satisfied with the appeals panel’s decision, facilitate the parties’ access to a competent alternative dispute resolution panel, an administrative authority or a court.
4.3 Automated Review
In response to the volume and velocity of notifications and requests for removal, platforms may employ automated review as the first line of content review, using automated removals, algorithmic take-downs and big-data analytics to cope with the high volume of requests.
However, where automated reviews and take-downs do not adequately process urgent or complex cases, or where disputes arise that require human review, platforms should establish a clear and simple procedure for complainants to request human review.
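As a purely illustrative sketch of such an escalation procedure, the routine below routes urgent, complex or disputed cases to human review and lets automation handle only routine, high-confidence cases. The confidence threshold and the case attributes are assumptions for exposition, not prescribed criteria.

```python
# Minimal sketch of triage between automated and human review.
# The threshold value and case attributes are illustrative assumptions.
from dataclasses import dataclass

CONFIDENCE_THRESHOLD = 0.95  # assumed cut-off for fully automated handling

@dataclass
class Case:
    classifier_confidence: float  # output of an automated screening model
    is_urgent: bool               # e.g. content posing imminent harm
    is_complex: bool              # e.g. requires weighing rights and interests
    is_disputed: bool             # e.g. user challenged an automated removal

def route(case: Case) -> str:
    """Send a case to human review whenever automation is inadequate."""
    if case.is_urgent or case.is_complex or case.is_disputed:
        return "human_review"
    if case.classifier_confidence < CONFIDENCE_THRESHOLD:
        return "human_review"
    return "automated_decision"
```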
4.4 Human Review
In cases that require more urgent review or involve complex issues, such as the appropriate weighing of rights and interests (in particular human rights and fundamental freedoms), or where disputes arise over automated content-removal decisions, requests for removal and complaints about removal decisions should be forwarded to a platform’s human review team in a timely manner. The human review team should assess whether the removal was correctly carried out according to the platform’s standards.
A clear and simple procedure should be made easily available for a timely appeal to an independent appeals panel in cases where a dispute has not been settled despite review by a platform’s human review team.
Human review team members and appeals panel members should be sufficiently trained and prepared for their professional functions, in particular for reviewing distressing content.
4.5 Algorithmic Enforcement
Governments should carefully weigh the benefits and drawbacks of algorithmic enforcement of rights when deciding whether, and to what extent, the use of algorithmic enforcement tools should be mandated. Accountability, transparency and non-discrimination requirements should apply to platforms that employ such tools, whether to comply with the law or voluntarily, including a) transparency of algorithms, training data and decision-making logic, b) the application of human-in-command principles, c) external audits, d) liability and redress for AI-generated harm, and e) transparency reports.
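One way to picture requirements a) through e) is as an audit record attached to every algorithmic enforcement decision. The record below is a hypothetical sketch; every field name is an assumption made for exposition, and no particular schema is mandated by these principles.

```python
# Hypothetical audit record for an algorithmic enforcement decision,
# mapping onto the transparency elements a)-e) above; all fields are assumptions.
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class EnforcementRecord:
    content_id: str
    decided_at: datetime
    model_version: str      # a) algorithm transparency
    training_data_ref: str  # a) provenance of the training data
    decision_logic: str     # a) human-readable rationale for the decision
    human_override: bool    # b) human-in-command: a person can overrule the tool
    auditor_visible: bool   # c) record is exposed to external audits
    redress_channel: str    # d) where affected users can seek redress
    # e) aggregates over such records feed the platform's transparency reports
```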
4.6 Full Disclosure and Transparency Principles
Platforms should provide clear information to users about the platforms’ standards and procedures. The criteria that platforms use for removing or blocking content should be set out clearly and transparently. Where a user is affected by a platform decision, platforms should inform the user of the reasons for the decision, including which law and/or platform standards the content violated and how manual or automated processes were used to identify the violation.
Platforms should also publish general information about their content moderation practices to enable regulators, government bodies, NGOs and other stakeholders to understand these processes and hold platforms and complainants accountable.
Additionally, in the interest of transparency, full disclosure by platforms should include the regular publication of the following information: a) posts removed, b) accounts banned or suspended, c) the nature of the complaints, and d) if and where appropriate, false-positive rates (i.e., the rate at which take-down requests or automated removal decisions were successfully challenged).
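For item d), a false-positive rate can be understood as the share of all take-downs that were later overturned on challenge. The function below is a minimal sketch of that computation under that assumed definition; the function name, parameters and worked numbers are illustrative only.

```python
# Illustrative computation of the false-positive rate in d):
# the share of all take-downs later overturned on challenge.
# Names and figures are assumptions for exposition.
def false_positive_rate(takedowns_challenged: int,
                        challenges_upheld: int,
                        total_takedowns: int) -> float:
    """Fraction of all take-downs that were successfully challenged."""
    if total_takedowns == 0:
        return 0.0
    assert challenges_upheld <= takedowns_challenged <= total_takedowns
    return challenges_upheld / total_takedowns

# Example: 1,000 removals, 80 challenged, 25 restored on review -> 2.5%
print(f"{false_positive_rate(80, 25, 1000):.1%}")
```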