We, the undersigned organisations, note with great concern a petition tabled on 15 August 2023 before Kenya’s National Assembly seeking to ban TikTok. While we note the salient issues concerning TikTok’s content moderation practices that have led to the spread of hate and disinformation on the platform – particularly around important political events and minorities at risk – we also express our strong opposition to the proposed ban of TikTok in Kenya.
The Petitioner, Ben Ndolo, claims that content on TikTok is inappropriate as it “promotes violence, explicit sexual content, hate speech, vulgar language, and offensive behaviour which is a serious threat to the cultural and religious values of Kenya”. The petition further avers that the platform is not regulated by the Communications Authority, leading to a failure to remove inappropriate and offensive content. It also accuses TikTok of violating the privacy rights of children and other users, and of sharing people’s data with third parties without consent. Moreover, it states that the app is addictive and could lead to mental health issues such as anxiety and sleep deprivation.
Social media platforms, including TikTok, have emerged as spaces for creative expression, entertainment, information sharing, social interaction, public participation, awareness raising, education, and engagement on a wide range of issues. In this day and age, platforms have also become an alternative source of relevant and credible news and information not covered by traditional or mainstream media, a notice board for ongoing events – including for government bodies – and the new digital public square for discourse, engagement and the exercise of democratic rights.
While social media platforms have played an important role in the realisation of rights – including freedom of expression and access to information – the proliferation of harmful content on these platforms has had severe human rights impacts, particularly during times of crisis, such as in Ethiopia. Companies hosting social media platforms therefore have a responsibility to put in place measures to identify and address the risks presented by harmful online content that can contribute to societal harm, and lawmakers have a responsibility to protect users from harm while also upholding free expression and other fundamental rights.
In 2022, the #KeepItOn campaign against internet shutdowns documented platform blocks or bans in 23 countries. Many of these governments justified the blocks by claiming that the platform had failed to properly moderate harmful content. Such blocks severely undermine freedom of expression and access to information, and yet there is no evidence that they meaningfully reduced the proliferation of harmful content or kept people safe. One such example is Nigeria’s Twitter ban, which the ECOWAS court ruled unlawful. Importantly, Kenyan authorities have so far resisted this blunt-force approach, including when they publicly declined to block access to Facebook during the 2022 election period.
Such a ban would, therefore, stifle the rights to freedom of expression, access to information, freedom of association and assembly, and political participation. It would also adversely affect the economic rights of Kenyans, and hinder the country’s progress towards a more open and inclusive digital society. While these rights are not absolute, any limitation of them must satisfy the three-part test of legality, necessity and proportionality. A total ban would not satisfy these principles. Several members of parliament have also noted the disproportionality of the petition’s prayers.
Kenya has a robust policy, legal, institutional, and regulatory framework for the ICT sector. While there is room for debate on how these frameworks can be further strengthened to tackle harmful online content, experts have warned that state over-regulation of content moderation – including the outright banning of platforms – can lead to the suppression of legitimate discourse and dissent. Overreaching regulation of social media platforms can also lead to market uncertainty, with destabilising impacts on people in the most vulnerable economic sectors. And where regulations rely on broad and subjective legal definitions (e.g. targeting “inappropriate or offensive” content), the result is often arbitrary enforcement and abuse by authorities.
We call upon the Public Petitions Committee and the National Assembly to:
- Reject the proposal to ban TikTok or any other social media platform in Kenya.
- Adopt a holistic and human rights-based approach in line with constitutional and international human rights standards to ensure the protection of citizens’ fundamental rights while addressing legitimate concerns surrounding social media use.
- Engage in open and inclusive multi-stakeholder dialogue with all relevant stakeholders (including academia, civil society, tech companies, government, and media) to develop effective and evidence-based approaches to address concerns surrounding social media platforms.
- Promote media and information literacy and digital literacy programmes that equip citizens with the digital skills to harness online opportunities, act as responsible digital citizens, enjoy their digital rights, and ensure their online safety.
We call upon TikTok to:
- Invest adequate administrative, human, technical and financial resources in content moderation to ensure their policies, processes, structures and practices are rights-respecting, context- and culture-sensitive, efficient and effective, particularly in their non-English-speaking and global majority markets.
- In alignment with the UN Guiding Principles on Business and Human Rights (UNGPs), provide periodic public reporting on progress toward the adoption and implementation of the above measures.
- Engage and collaborate meaningfully with stakeholders – including academia, civil society, government, independent experts and media – to gain insights that ensure platform policies and enforcement practices are responsive, proportionate and consistent with human rights standards and best practices.
- Streamline transparency and accountability mechanisms, including by providing regular and comprehensive transparency reports with data disaggregated by country on content removed from the platform, and insights into how content is reviewed, flagged and removed. Such regular transparency reporting is already a de facto standard for the tech sector in general.
- Empower users to protect themselves from harmful content, including by adopting simplified community guidelines or standards; ensuring mechanisms for reporting harmful content and abuse are efficient and user-friendly; implementing effective remedial mechanisms for users; educating users about the risks of harmful content; providing tools and options to users to manage their online experiences such as age and content filters; and engaging in educational campaigns to promote safe and responsible platform usage and digital literacy.
- Apply the principles of transparency, confidentiality, purpose limitation and data minimisation to their data practices, to ensure that users have adequate knowledge of and autonomy over the processing of their data.
- Access Now
- ARTICLE 19 Eastern Africa
- Bloggers Association of Kenya (BAKE)
- Haki na Sheria
- Kenya Human Rights Commission (KHRC)
- Paradigm Initiative