In this blog post, you will find the link to my final submission to the ICTA on its proposed amendments to the ICT Act. In the last section of my paper, I include my answers (reproduced below) to the specific questions of the ICTA in its Consultation Paper.
Summary of questions being released for public consultation
14.1 What are your views on the present approach of self-regulation of social networks by social media administrators themselves where they decide to remove an online content or not based on their own usage policy and irrespective of your domestic law?
Countries around the world face issues concerning the circulation, on social media platforms, of posts which are potentially in breach of their domestic laws. There is, at present, no fully satisfactory response proposed or effectively deployed in any democratic country. Only non-democratic countries resort to drastic measures aimed at blocking and/or intercepting all of their citizens’ online communications and social media traffic in an attempt to regulate them. As a democratic country, Mauritius cannot use methods better suited to non-democratic regimes.
It is true that content circulated online on social media, whether targeting or created by Mauritian citizens, may be in breach of domestic laws. However, an objective assessment of the extent of such illicit content needs to be conducted to determine the scale of abuse and/or misuse, as already specified in the paper above.
It should also be recalled that social media platforms offer various levels of privacy, meaning that the online communication sphere they create may be divided into multiple sub-categories, which can tentatively be listed as follows:
- Online national public sphere created by public personalities (prominent members of society such as politicians, leaders of big organisations, community and religious leaders, opinion leaders, etc.) and organisations (whether public or private bodies) who decide to publish their posts on the full “public mode” level. Their content becomes accessible to anyone without the need to be directly connected as “friends” or “friends of friends”. They generally have a large number of followers/friends and their posts can be shared, thereby enabling them to become viral.
- Targeted public circles created by specific individuals and bodies who wish to communicate within a semi-restricted sphere, upon invitation.
- Private circles whereby an individual or entity communicates only with their friends and whose posts cannot be shared outside of the network of friends.
There are obviously more levels of control generally available on some social media platforms in between these three broad categories. Suffice it to say that the first level (the online national public sphere) is the one which should command the most attention, followed by the second level (targeted public circles), whereas the third level (private circles) may be considered the equivalent of private conversations between private individuals.
Individuals and entities who have large follower bases in the online national public sphere and targeted public circles are the ones who should be more subject to scrutiny as they have the potential for virality and their speech is tantamount to public speech, which may be evaluated against prevailing domestic laws.
Intense debates have taken place after major incidents linked to the social media accounts of public figures such as former US President Donald Trump, whose accounts were shut down on platforms such as Twitter and Facebook after the Capitol invasion of January 2021, following his electoral defeat. Despite multiple posts containing fake news and racist comments during his presidency, he was only banned from the platforms after he lost the election, sparking debates about whether all major public figures around the world would henceforth be liable to similar treatment. This was one of the most prominent cases entrusted by Facebook to its own Oversight Board for review. The Oversight Board published its ruling on 5 May 2021: it upheld the decision but requested that Facebook review it within the following six months and also develop clear, necessary and proportionate policies that promote public safety and respect freedom of expression.
Thus, it can be deduced that platforms are not monolithic organisations and are prone to debate on both internal and external fronts. All parties involved still grapple with the delicate issue of regulation and remain open to reviews and possible disruption of their moderation policies and mechanisms.
These mechanisms remain unsatisfactory and will probably never be fully satisfactory, owing to the complexity involved: the very same platforms can be used
- to mobilise for good causes or incite bad mass behaviour,
- to communicate on different levels such as one-to-one, one-to-few, few-to-few, one-to-many or many-to-many.
Regulating social media is not an easy task, and a simple one-size-fits-all solution will simply not work without causing greater harm.
14.2 Do you think that the damage caused by the excesses and abuses of social networks to social cohesion warrants a different approach from the self-regulatory regime presently being enforced by social media administrators themselves?
Not enough data has been made available about the damage caused by the excesses and abuses of social networks to social cohesion in Mauritius to formulate an informed reply to this question. The data remains impressionistic rather than scientific. For instance, how many such cases have there been over recent months and years that have not been detected and tackled through the current investigation and legal system? What have been the trends in the number of such cases? What is the exact breakdown? What actions have the authorities taken? In what way have these been insufficient? To date, we have no knowledge of massive excesses that have created havoc in the social fabric of our country. To our knowledge, no major riots or major ethnic conflicts have been reported by the Mauritian police in recent times.
14.3 What are your views on the overall proposed operational framework in terms of the
• National Digital Ethics Committee (NDEC)
• Enforcement Division
which is intended to bring more clarity to section 18 (m) of the ICT Act, where the ICTA is mandated to take steps to regulate or curtail the harmful and illegal content on the Internet and other information and communication services.
Please refer to comments made in the paper submitted above.
14.4 What are your views on the proposed legal amendments to the ICT Act to give legal sanctity and enforcement power to the NDEC?
This is a risky proposal, as the composition of such a committee will inevitably be reviewed each time a new political party or coalition gains a majority in the National Assembly to form a government. Such powers should rest only within the justice system, to ensure impartiality and independence from political power games.
14.5 What are your views on the proposed modus operandi of the NDEC?
There is scant information provided in the consultation paper about the modus operandi of the proposed NDEC. It merely stipulates that “it is proposed that the Chairperson and members of the NDEC be independent, and persons of high calibre and good repute.”
There are no indications whatsoever about the following:
- Number of members
- Composition of the committee (profiles of members)
- Process for nomination of members
- Duration of mandate for members
- Process for evaluating complaints
- Existence of a review or appeal system (as opposed to the justice system, where the different appeal levels are clearly defined by law)
- Proposed stages of penalty or fines (as opposed to the justice system where maximum sentences and fines are clearly defined by law)
14.6 What are your suggestions on the safeguard measures to be placed for the NDEC?
No specific safeguard measures for the NDEC have been proposed in the consultation paper. The paper merely stipulates that:
“Before start of operations, the NDEC will also be tasked to come up with sufficient and effective safeguards to be published in order to ensure complete operational transparency and avoidance of any abusive use and misuse of this operational framework.”
First and foremost, the nature of the committee itself is problematic and one cannot expect the committee to define its own safeguards, which would be the equivalent of a watchdog being asked to watch over itself.
14.7 What are your views on the use of the technical toolset, especially with respect to its privacy and confidentiality implications when enforcing the mandatory need to decrypt social media traffic?
As stated by major browser makers, namely Google and Mozilla, in their joint submission, the proposed technical toolset is very dangerous: it would undermine the privacy of Mauritian citizens and put them at risk of attacks by third parties even if it works as intended. Browser makers may decide to block the system outright. The proxy server may be clogged by the volume of traffic to be handled and may even be targeted by cyber attacks. As a result, access to the internet may be slowed down or even paralysed for everyone in Mauritius. This would impact not only citizens but also foreigners communicating and transacting with people based in Mauritius, as well as local and international corporate entities.
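The reason browser makers could block the system can be seen in a simplified model of the trust-store check every browser performs during the TLS handshake. This is only a sketch: real validation also verifies signature chains, expiry dates and hostnames, and the root name “ICTA-Proxy-Root” is hypothetical.

```python
# Simplified model of certificate-chain acceptance in a browser.
# A chain is trusted only if it terminates in a root certificate that
# either ships with the browser or was manually installed by the user.

TRUSTED_ROOTS = {"DigiCert Global Root", "ISRG Root X1"}  # shipped with the browser

def connection_is_trusted(presented_root: str, user_installed=frozenset()) -> bool:
    """Accept the connection only if the presented chain ends in a known root."""
    return presented_root in TRUSTED_ROOTS or presented_root in user_installed

# Normal case: the platform's genuine certificate chains to a public root.
assert connection_is_trusted("ISRG Root X1")

# The intercepting proxy re-signs traffic with its own self-signed root:
# rejected by default, accepted only if each citizen installs that root.
assert not connection_is_trusted("ICTA-Proxy-Root")
assert connection_is_trusted("ICTA-Proxy-Root",
                             user_installed=frozenset({"ICTA-Proxy-Root"}))
```

This is precisely why the consultation paper asks users to install the self-signed certificate: without that manual step, every browser would refuse the intercepted connections outright, and browser makers could additionally blacklist the root if they judged it abusive.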
This toolset could set us back completely as an aspiring tech hub. We may lose business and partnership opportunities, incoming FDI and inbound tourists, and existing foreign companies may prefer to move their branches or headquarters to other countries.
If this toolset is implemented, Mauritius will also undoubtedly be downgraded in international indices such as Freedom House, Reporters Without Borders, the ITU ICT Development Index, the Global Innovation Index or even the Ease of Doing Business index, inter alia. The repercussions for the economy may be far-reaching. Many international organisations have already expressed their concern and opposed the proposed amendments to the ICT Act.
14.8 Can you propose an alternative technical toolset of a less intrusive nature which will enable the proposed operational framework to operate in an expeditious, autonomous and independent manner from the need to request technical data from social media administrators?
No technical toolset can be proposed at the local level. Solutions should preferably be sought in consultation with the social media platforms, and the legal and judicial systems should be better equipped to deal with complaints related to social media.
In the first case, diplomatic negotiations should be leveraged with regional organisations such as the African Union, SADC and COMESA or international partnerships such as with fellow Commonwealth and Francophone allies and SIDS (Small Island Developing States).
When reinforcing our legal capacity to tackle infringements to domestic laws on social media, the investigative teams within the police force also need to be provided with regular training as technological products and services evolve quickly.
More importantly, if abuse and misuse of social media in breach of domestic laws are to be tackled in a sustainable manner, it is very important to invest in nation-wide as well as targeted campaigns for:
- digital literacy,
- media and information literacy,
- civic education.
14.9 Should the Courts be empowered to impose sentences (which include banning use of social media) on persons convicted of offences relating to misuse of social media tools?
The responsibility for adjudicating sentences should indeed remain within the judiciary, as learned magistrates and judges are professionals trained in sentencing.
The investigative teams within police departments concerned should also be reinforced through capacity building.
Summary of key points in the body of my Analytical Paper
- It is a positive point that the ICTA has published this as a Consultation Paper and extended deadline for submission of comments.
- ICTA states that social media platforms take too much time to respond to its requests, that their community standards are not as strict as, nor necessarily compliant with, our domestic laws, and that they may not be able to moderate content adequately as they do not understand our Mauritian Creole language. For the ICTA, a “minority of individuals or organised groups” are at fault and “The issue at hand is when these abuses, even though perpetrated by few individuals/groups, go viral, the damage created is very far reaching.”
- Before proposing any solution, the exact nature of an existing problem needs to be examined in detail to answer the question “How big is the problem of misuse and abuse of social media really in the country that warrants such far-reaching measures as those being proposed by the ICTA in its consultation paper?”
- This should include detailed information about the “very far reaching damage” created. The consultation paper does include a table listing the number of incidents reported to the MAUCORS but lacks information about the current investigation process, actions taken by authorities, whether dealt with locally by law or reported to platforms, and results obtained and final outcomes achieved.
The ICTA proposal:
- Setting up of a National Digital Ethics Committee (NDEC) as the decision-making body to: “investigate on illegal and harmful content on its own or through interaction with other stakeholders already involved in national security, crime investigation, detection and prevention or through complaints received;”
- Creation of a Technical Enforcement Unit to enforce the technical measures as directed by the NDEC.
- Use of a technical toolset whereby internet users in Mauritius will have to use a local proxy server as intermediary and will be asked to install a self-signed digital certificate, allowing the ICTA server to be accepted as an intermediary by the browser or the app, thereby bypassing the protections of the HTTPS protocol.
- The adjective “harmful” needs to be defined clearly to understand what is allowed and what is not allowed. Otherwise, there is a risk of potential abuse and misuse by authorities and any parties involved in the mechanism. Assuming that the intention of the current appointees of the ICTA with this proposal is genuinely to create a safe online public sphere in Mauritius and to work for the greater good rather than for vested interests, there is no guarantee that subsequent appointees would follow suit. Petty political games may ensue.
- The adjective “illegal” is clearer as this relates to the laws of the country, which are enacted by parliament, interpreted by courts of justice and implemented by policy-makers.
- The main issue with this proposal is a potential conflict with Section 12 of the Constitution as highlighted in the paper itself. The Constitution states that there can be no interference in our communications except with our consent and except for specific provisions involving public defence, safety, order, morality and health, reputation, privacy, courts authority, technical administration of telecoms and civil service restrictions. BUT, the provisions should be “justifiably reasonable in a democratic society” and the proposed amendments to the ICT Act do not seem “justifiably reasonable in a democratic society”.
- The proposal is tantamount to blanket surveillance on the citizens of Mauritius without judicial oversight. ALL citizens’ communication would be interfered with in a blanket manner. The proportionality test needs to be applied: Does this pass the balance of risks/benefits? This proposal seems to contradict the philosophy of the Data Protection Act which aims to protect citizens’ private data and which was adopted in 2017 (and modelled on the EU GDPR).
- No other democratic state is using such a system as the one proposed by the ICTA. There is a high probability that Mauritius will tumble down in world freedom rankings.
- In all the democratic states cited in the consultation paper, the responsibility for moderation and removal of illegal content remains vested with the social media platforms: those state agencies never endow themselves with the technical capacity to interfere with, intercept or remove content on social media, in contrast with the Mauritian ICTA proposal.
- The mechanism for intercepting, decrypting and re-encrypting social media traffic would require enormous technical capabilities given the enormous amounts of data (photos, videos, livestreams, etc.) generated daily. It is also very much a moving target. Any social media network may change some implementation details which might make the system ineffective. Anyone can also bypass the system just by using a Virtual Private Network (VPN).
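Beyond the VPN bypass, certificate pinning illustrates how fragile the interception mechanism would be: many mobile apps ship with the fingerprint of their platform’s genuine certificate and refuse any other, so the re-signed certificate an intercepting proxy must present is simply rejected. A minimal sketch with mock certificate bytes (the certificate contents here are placeholders, not real certificates):

```python
import hashlib

def fingerprint(cert_bytes: bytes) -> str:
    """SHA-256 fingerprint of a (mock) certificate, as pinned by an app."""
    return hashlib.sha256(cert_bytes).hexdigest()

# The app ships with the fingerprint of the platform's genuine certificate.
GENUINE_CERT = b"-----MOCK CERT: social-platform.example-----"
PINNED_FINGERPRINT = fingerprint(GENUINE_CERT)

def app_accepts(presented_cert: bytes) -> bool:
    # A pinned app refuses any other certificate,
    # even one the operating system has been told to trust.
    return fingerprint(presented_cert) == PINNED_FINGERPRINT

assert app_accepts(GENUINE_CERT)

# The intercepting proxy can only present its own re-signed certificate,
# which cannot match the pin, so the app's traffic simply fails.
PROXY_CERT = b"-----MOCK CERT: re-signed by interception proxy-----"
assert not app_accepts(PROXY_CERT)
```

The practical consequence is the “moving target” noted above: each pinned app would either stop working for Mauritian users or force the platform into cat-and-mouse adjustments, while VPN users escape the proxy entirely.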
Potential alternatives to the proposed amendments
- The last question in the consultation paper asks citizens if they think that local courts of justice should be empowered to “impose sentences (which include banning use of social media) on persons convicted of offences relating to misuse of social media tools”. This may indeed be explored.
- Preventive measures such as education, sensitisation and digital literacy remain crucial to create a more civic space online.
- We could also explore the use of our diplomatic connections at the regional level to enter into negotiations with the social media platforms in order to improve their responsiveness to requests from local authorities (e.g. AU, SADC).
- The public consultation formula is a positive process. In the same spirit, it is expected that there be a public report about inputs received (as announced by the ICTA) and how these inputs are tackled by the ICTA and any other authority involved in the process.
- On the basis of information provided so far, the present proposed amendments do not seem reasonable at all in terms of the proportionality of the measures with respect to the problem to be tackled. Rather, they represent a significant threat to our democratic setup and may have far-reaching repercussions not only on the social and political but also on the economic fronts.