
1 Scope of Principles

Human rights and fundamental freedoms must be upheld everywhere; they are no less important in the online environment than they are offline. Governments are generally responsible for safeguarding human rights and fundamental freedoms; however, the activities of governments are not the only ones that affect human rights. On the internet, content platforms (“platforms”) create the spaces, structures, and processes for interaction, education, work, and commerce, and they therefore play a crucial role in shaping the online environment and individuals’ ability to enjoy their rights and freedoms. That role is even more significant when a platform enjoys a monopoly position in a given market. Notwithstanding this role of platforms as a modern public square, national and international laws and courts have not extended responsibility for the maintenance of human rights to platforms.

Nonetheless, platforms should take an active role in preserving human rights and fundamental freedoms on the internet. Responsibility for safeguarding human rights and fundamental freedoms on the internet may be extended to platforms through laws, including international laws. Even when such laws do not exist, however, platforms should take active steps to preserve human rights – whether as a part of corporate social responsibility or as a matter of business ethics. Some platforms have already recognized the effects of their activities on human rights and fundamental freedoms, and all platforms should convert that recognition into concrete steps in furtherance of those rights and freedoms on the internet.

Processes that are consistent with due process principles are crucially important for maintaining procedural fairness and substantive justice, and due process principles should permeate all activities of platforms. Some activities of platforms are regulated by law or by court and agency decisions, which should incorporate due process principles. Other activities of platforms might not be so regulated, and in such cases it is up to the platforms to implement processes that safeguard due process principles.

Platforms should apply due process principles to all of their decisions that affect their users and other parties. Maintaining these principles has become increasingly significant as platforms have engaged in what has been perceived as the under-blocking and over-blocking of online content. Due process principles should assist in mitigating under- and over-blocking, and platforms should therefore apply them when they moderate content, curate private data, make online disclosures, publish transparency statements, and conduct other online activities. The establishment of a proper balance among different fundamental rights – including but not limited to the freedom of expression, the protection of personal data, and the protection of intellectual property rights – as well as the upholding of the rule of law in preventing online crime, depends on the proper application of due process in the online environment, and platforms must play an integral role in maintaining due process online.

Public discourse, freedom of expression, and global online crime prevention are at present under extreme pressure, whether from significant over-blocking or from under-blocking by platforms. At one end of the spectrum, user-generated content may be subjected to algorithmic content identification and blocked even absent any violation of law. At the other end, clearly criminal content such as child pornography is not sufficiently blocked throughout the world. These extremes in content blocking may be caused by the inadequate and unbalanced decision-making processes that platforms adopt. Due process principles, including procedural fairness and natural justice, should guide the design and implementation of platform processes so that human rights, fundamental freedoms, and the rule of law are restored and maintained in the online environment.

Additionally, concerted international efforts should be undertaken to coordinate on a global basis the principles for platforms’ decision-making processes concerning user-generated content. Such coordination, in the interest of legal certainty and justice, should preferably lead to the development of standardised procedures which are clear, accessible and user-friendly. Standardisation should also aim at improving transparency and accountability by producing reliable data to facilitate third-party scrutiny by regulators, NGOs, and other stakeholders.

As there is a broad spectrum of platforms, from small to large, in terms of their financial and technical resources and their capabilities and functionalities, compliance with due process requirements should be assessed with a view to proportionality and reasonableness, taking into account the specific circumstances in any particular context.

2 General Principles of Platform Decision-Making Concerning User-Generated Content

Due process should be reflected in platforms’ decision-making concerning user-generated content and should include, among other principles:
1. a fair and public review within a reasonable time and by an independent and impartial decision-maker,
2. a proper prior notification of the parties,
3. an opportunity for the parties to respond and present evidence,
4. the right to legal representation,
5. the right to appeal to an internal appeals panel, alternative dispute resolution panel or competent court, and
6. the right to receive a decision in writing or other perceivable form and in a preservable format which clearly articulates the reason for the decision.

As part of their due process obligations, platforms should provide clear information to users about the platforms’ decision-making structures and procedures. Such information should include a simple, easy-to-understand notification to a user when content is taken down by a platform, together with clear reasons for the take-down and, when applicable, the legal basis for the take-down. Platforms should also provide a clear and simple outline of the process for challenging the take-down, whether through a counter-notice or other equivalent means.

Users should be given an opportunity to request timely follow-up actions and timely decisions by platforms. The rules governing users’ actions (as well as rules and timelines on notice and take-down, submission of evidence, appeals procedures, and timelines for all decision-making processes) should be clear, publicly accessible, easy to understand, and applied in a timely manner.

Platforms should continuously evaluate whether their decision-making processes are designed and implemented adequately to reflect due process principles. The decision-making process must not be entirely automated; there should be provision for adequate and balanced input via human review, at least at the appeals stage. Platforms should provide for the proper and adequate professional qualification of, and safety training for, reviewers and other decision-makers.

Platforms should safeguard user privacy, including the principles of lawfulness, fairness, and transparency; limitation of the purposes for data collection, processing, and storage; data minimization; accuracy of data; data storage limits; integrity, security, and confidentiality; and accountability. Where appropriate, and even if not required to do so by law, platforms should make information available to users about the users’ data and should provide for data rectification and deletion, data portability, and control of legacy accounts. Platforms should design their processes so that they are available only to complainants who themselves implement fair internal notification processes. To be effective, notifications must satisfy all procedural and substantive requirements. Complainants must play their part by implementing adequate internal processes, and this applies also to government agencies. For instance, a law enforcement agency using an expedited take-down process which is not subject to judicial or independent third-party oversight should implement internal due process safeguards to prevent the risk of false positives. Preventing false positives is the responsibility of both complainants and platforms.

Independent oversight is important to ensuring compliance with due process principles. Platforms’ decision-making processes should not prevent a user from seeking judicial review by a court or an independent alternative dispute resolution body, whose procedures should safeguard due process principles.

3 Platform Decision-Making Concerning Online Criminal Content – Measures to Address Under-blocking

Platform decision-making concerning criminal content should include the following platform review and judicial review processes.

Platforms must comply with court and agency orders that require the take-down of criminal content. Additionally, and to the extent permitted by law, platforms should respond to other requests for the take-down of criminal content; for these requests, platforms should establish prompt, transparent and efficient procedures for the submission and review of criminal content notifications, and should implement effective remedies.

Unless these matters are prescribed by law, platforms should prescribe the content and form of a proper notification, together with 1) applicable deadlines, 2) timelines for follow-up by platforms, 3) the appropriate format for the submission of evidence, and 4) a procedure for the removal of criminal content in a timely manner or, in some instances where warranted, on an expedited basis.
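As an illustration only, the elements of such a notification could be captured in a single structured record. The following sketch is hypothetical: the CriminalContentNotification name, every field, and the deadline values are placeholders chosen for the example, not requirements drawn from any law or existing platform procedure.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class CriminalContentNotification:
    """Hypothetical sketch of a criminal-content notification record."""
    complainant: str                # e.g. a law enforcement authority, NGO, or individual
    legal_basis: str                # statute, court order, or agency order relied upon
    content_url: str                # location of the allegedly criminal content
    description: str                # why the content is considered criminal
    evidence: list[str] = field(default_factory=list)  # evidence in the platform's prescribed format
    expedited: bool = False         # whether expedited removal is requested (imminent harm)
    submitted_at: datetime = field(default_factory=datetime.utcnow)

    def follow_up_deadline(self) -> datetime:
        """Illustrative platform follow-up deadline; shorter for expedited requests."""
        hours = 24 if self.expedited else 72   # assumed timelines, not legal requirements
        return self.submitted_at + timedelta(hours=hours)
```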

Platform decision-making concerning the removal of criminal content may include or provide for the following platform review and judicial review processes.

1. In urgent circumstances, platforms should make provision for the expedited removal of flagrant criminal content or content which involves a flagrant violation of human rights and which causes imminent harm. Platforms should remove or disable access to such content as soon as practically possible and on an urgent basis upon receipt and review of the notification and/or removal request.

2. Expedited removal should follow an expedited review by the platform when so required by law, such as pursuant to court orders and/or requests by law enforcement authorities, or when requested by trusted parties, such as other relevant government agencies or well-established relevant non-governmental organisations (NGOs).

3. In other circumstances, platforms should make provision for the removal of criminal content or content which involves the violation of human rights in a timely manner. Platforms should remove or disable access to such content as soon as is reasonably possible upon receipt and review of the removal order or notification.

4. Due process must also be maintained at the source; therefore, prior to issuing a content removal request, complainants such as law enforcement authorities, government agencies and NGOs should establish that they have an appropriate legal basis, such as probable cause or reasonable grounds, to conclude that the notified activity or information is illegal.

5. Other than in cases where take-downs are mandated by law, court order and/or a request by a law enforcement agency, platforms should limit or remove the ability to use the expedited and other procedures for complainants who consistently show high false positive rates over a specified period of time and/or who misuse the expedited procedures to suppress lawful expression.
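Purely as a sketch of how point 5 could be operationalised, the function below gates access to expedited procedures on a complainant's recent false positive rate; the threshold, evaluation window, and minimum request volume are assumed placeholder values, and the record fields are hypothetical.

```python
from datetime import datetime, timedelta

# Hypothetical parameters; a platform (or the law) would set the actual values.
FALSE_POSITIVE_THRESHOLD = 0.20   # share of take-downs later reversed on review
EVALUATION_WINDOW = timedelta(days=90)
MINIMUM_REQUESTS = 25             # avoid penalising low-volume complainants on noise

def expedited_access_allowed(requests: list[dict], mandated_by_law: bool) -> bool:
    """Sketch: decide whether a complainant keeps access to expedited procedures.

    Each request dict is assumed to carry 'submitted_at' (datetime) and
    'reversed' (bool: True if the take-down was later overturned).
    """
    if mandated_by_law:
        return True  # take-downs mandated by law or court order are not limited
    cutoff = datetime.utcnow() - EVALUATION_WINDOW
    recent = [r for r in requests if r["submitted_at"] >= cutoff]
    if len(recent) < MINIMUM_REQUESTS:
        return True
    false_positive_rate = sum(r["reversed"] for r in recent) / len(recent)
    return false_positive_rate < FALSE_POSITIVE_THRESHOLD
```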

4 Platform Decision-Making Concerning User-Generated Content – Measures to Address Over-blocking

4.1 Overview

Platform decision-making concerning user-generated content should include the following platform review and judicial review processes.

Platforms must comply with laws and with court and agency orders that require the take-down of illegal content, such as copyright-infringing content. Additionally, and to the extent permitted by law, platforms may moderate user-generated content either of their own accord or based on take-down requests and/or notifications; in these circumstances, platforms should provide for prompt, transparent and efficient review of complaints that users submit against take-downs of their content, and should implement effective remedies, including content restoration (put-back).

Platforms should establish a clear, simple and easy-to-understand procedure for notifying users about take-down requests and/or notifications that the platforms receive. Platforms should prescribe the content of a proper notification and counter-notice and establish procedures to handle unjustified take-down complaints. Procedural matters that platforms prescribe should include a) notification and counter-notice deadlines, b) timely follow-up actions by platforms, c) an opportunity for a complainant to submit evidence in the appropriate format as specified by the platform and within a reasonable time frame, and d) appropriate and quick appeals procedures, including an opportunity for the parties to be heard and resulting in a timely review decision. Platforms should not make it unduly difficult to provide evidence, nor should a platform establish stricter requirements than those envisaged by law.
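One way a platform might publish the procedural parameters in points a) to d) is as a single, openly documented configuration. The sketch below is illustrative only: the TakeDownProcedureConfig name and every value are assumptions, not recommended timelines.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TakeDownProcedureConfig:
    """Hypothetical published parameters for a notice-and-counter-notice procedure."""
    notification_deadline_days: int = 14        # a) deadline for the complainant's notification
    counter_notice_deadline_days: int = 14      # a) deadline for the user's counter-notice
    platform_follow_up_days: int = 7            # b) timely follow-up action by the platform
    evidence_formats: tuple = ("pdf", "png", "url")  # c) accepted evidence formats
    evidence_window_days: int = 10              # c) reasonable time frame to submit evidence
    appeal_window_days: int = 14                # d) window to lodge an appeal
    appeal_decision_days: int = 30              # d) target for a timely review decision
```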

Platform decision-making concerning the removal of user-generated content may include or provide for the following platform review and judicial review processes.

1. Even when they are not required to do so by law, platforms should, where appropriate, establish a notice-and-take-down procedure which allows for the notification and timely removal of law-violating content or content which violates a platform’s standards. Platforms should remove or disable access to such content as soon as reasonably possible upon receipt and review of the notification. Platforms should provide for prompt, transparent and efficient review of take-down requests, and effective remedies, including, in appropriate circumstances, content labelling as an alternative to take-downs.

2. Platforms should notify users whose content is removed about the specific reasons for the removal. Platforms should also inform the users, in a simple and clear manner, of the procedures, such as counter-notices, which the users may use to appeal the removal of their content. Such procedures should be easy to understand and user-friendly, and should be outlined in simple terms.

3. Platforms’ and complainants’ decision-making concerning user-generated content shall not, in principle, prevent users from making available content which is lawful, including where such content is covered by freedom of expression, fair use, or other exceptions or limitations to the rights of others, and where such content does not violate a platform’s standards.

4. In order to avoid over-blocking and to preserve human rights and fundamental freedoms, such as the freedom of expression and the freedom to conduct a business, governments should create a legal environment in which platforms are not liable for keeping content, other than flagrantly illegal or infringing content, available until the legality assessment is completed.

4.2 Appeals Panel

Where a user or complainant files an appeal against a platform’s decision to remove or keep available content, the decision should be reviewed by a competent, independent and impartial decision-maker, such as an appeals panel, in a timely manner. On appeal, the appeals panel should assess the compliance of the decision with the platform’s standards. Appeals panel members should have adequate legal and professional training, accreditation and independent standing.

The law may require a review by a competent alternative dispute resolution panel, administrative authority or court – either in all cases or only where the party or parties are not satisfied with the decision of the appeals panel. Even when the law does not so require, in cases where a party or parties are not satisfied with the decision of the appeals panel, platforms should facilitate the parties’ access to appeal to a competent alternative dispute resolution panel, an administrative authority or a court.

4.3 Automated Review

In response to the volume and velocity of notifications and requests for removal, platforms may employ automated review as the first line of content review, relying on automated removals, algorithmic take-downs and big data analytics to cope with the high volume of requests.

However, where automated reviews and take-downs do not adequately process urgent or complex cases or where disputes arise which require human review, platforms should establish a clear and simple procedure for complainants to request human review.
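A minimal sketch of such a first-line automated review with an escalation path to human review follows; the classifier score, the thresholds, and the function and type names are hypothetical placeholders rather than a description of any particular platform's system.

```python
from enum import Enum

class Outcome(Enum):
    REMOVE = "remove"
    KEEP = "keep"
    ESCALATE_TO_HUMAN = "escalate_to_human"

# Assumed thresholds; a real deployment would calibrate these and log every decision.
REMOVE_THRESHOLD = 0.95
KEEP_THRESHOLD = 0.05

def automated_first_line_review(violation_score: float, urgent: bool, disputed: bool) -> Outcome:
    """Sketch: automated triage of a removal request.

    violation_score is the output of a hypothetical classifier in [0, 1].
    Urgent or disputed cases always go to human review, mirroring the principle
    that automation alone should not handle complex or contested cases.
    """
    if urgent or disputed:
        return Outcome.ESCALATE_TO_HUMAN
    if violation_score >= REMOVE_THRESHOLD:
        return Outcome.REMOVE          # still appealable; human review at the appeals stage
    if violation_score <= KEEP_THRESHOLD:
        return Outcome.KEEP
    return Outcome.ESCALATE_TO_HUMAN   # uncertain cases go to the human review team
```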

4.4 Human Review

In cases which require more urgent review or which involve complex issues, such as the appropriate weighing of rights and interests, in particular human rights and fundamental freedoms, or where disputes arise over automated content removal decisions, requests for removal and complaints about removal decisions should be forwarded to a platform’s human review team in a timely manner. The human review team should assess whether the removal has been carried out correctly according to the platform’s standards.

A clear and simple procedure should be made easily available for a timely appeal to an independent appeals panel in those cases where disputes have not been settled despite the review by a platform’s human review team.

Human review team members and appeals panel members should be sufficiently trained and prepared for their professional functions and, in particular, for reviewing distressing content.

4.5 Algorithmic Enforcement

Governments should carefully weigh the benefits and drawbacks of the algorithmic enforcement of rights when deciding whether and to what extent the use of algorithmic enforcement tools should be mandated. Accountability, transparency, and non-discrimination requirements should apply to platforms which employ such tools, whether to comply with law or voluntarily, including a) transparency of algorithms, training data and decision-making logic, b) the application of human-in-command principles, c) external audits, d) liability and redress for AI-generated harm, and e) transparency reports.

4.6 Full Disclosure and Transparency Principles

Platforms should provide clear information to users about the platforms’ standards and procedures. The criteria that platforms use for removing or blocking content should be clear and transparent. Where a user is affected by a platform decision, platforms should inform the user about the reasons for the decision, including which law and/or platform standards the content violated and how manual or automated processes were used to identify the violation.

Platforms should also publish general information about their content moderation practices to enable regulators, government bodies, NGOs and other stakeholders to understand these processes and hold platforms and complainants accountable.

Additionally, in the interest of transparency, full disclosure by platforms should include the regular publication of the following information: a) posts removed, b) accounts banned or suspended, c) the nature of the complaints, and d) if and where appropriate, false positive rates (i.e., where a take-down request or automated removal decision was successfully challenged).
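Purely as an illustration of how items a) to d) could be compiled for a reporting period, the sketch below aggregates a hypothetical log of moderation decisions; the field names and the report shape are assumptions, not a prescribed reporting format.

```python
from collections import Counter

def build_transparency_report(decisions: list[dict]) -> dict:
    """Sketch: aggregate one period's moderation decisions into a public report.

    Each decision dict is assumed to contain 'action' ('post_removed',
    'account_banned', 'account_suspended', ...), 'complaint_type' (str),
    and 'reversed_on_appeal' (bool).
    """
    removals = [d for d in decisions if d["action"] == "post_removed"]
    return {
        "posts_removed": len(removals),
        "accounts_banned": sum(d["action"] == "account_banned" for d in decisions),
        "accounts_suspended": sum(d["action"] == "account_suspended" for d in decisions),
        "complaints_by_nature": dict(Counter(d["complaint_type"] for d in decisions)),
        # False positive rate: share of removals later reversed on appeal or review.
        "false_positive_rate": (
            sum(d["reversed_on_appeal"] for d in removals) / len(removals)
            if removals else None
        ),
    }
```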

5 Digital Literacy

To facilitate users’ awareness of and effective involvement in the procedures above, platforms should contribute towards efforts to improve online literacy. For example, platforms may raise awareness of due process mechanisms and assist users in learning how to use online tools to confirm the authenticity of products and offerings online and the accuracy of online content.

When designing policies and procedures to implement due process principles, platforms should take into account any potential barriers to access that users may experience, including varying degrees of online literacy. Platforms should ensure that such policies and procedures are accessible to all users, including members of systematically excluded groups and users with disabilities.
