4.1 Introduction
While US-based platforms like Facebook, YouTube, and Twitter dominate the social media markets in most countries, China, as the world’s second-largest digital economy,Footnote 1 has its own platform ecosystem, in which indigenous platforms like WeChat, Weibo, and Douyin are the major players.Footnote 2 China’s authoritarian political system, together with this unique platform ecosystem, has shaped how it approaches platform responsibility for content moderation.
In liberal democracies, debates around platform responsibility have focused on whether, and increasingly how, to regulate platforms, and mainly revolve around balancing two oftentimes incompatible objectives: holding platforms accountable for illegal or harmful content and protecting freedom of expression. In contrast, Chinese platforms, despite holding dominant market power over online information distribution and consumption,Footnote 3 fall under the heavy regulation of the government and are required to take the “primary responsibility” (主体责任) for content governance (to be discussed later).Footnote 4
This chapter examines how China approaches platform responsibility for content moderation, bearing in mind its relevance to the global debates around this issue. First, it discusses the general principles for China’s online content governance. This is followed by a more detailed analysis of how China defines illegal and harmful content, sets platform obligations, and enforces regulations. As part of a wider project that compares the platform responsibility in different jurisdictions, this chapter then conducts a brief case study on TikTok, the China-originated video-sharing platform that has gained global popularity, to illustrate the following interactions: how China’s laws and regulations influence TikTok’s content moderation policies and practices in overseas markets and how TikTok’s content moderation is influenced by other nations’ laws and regulations.
4.2 General Principles of China’s Online Content Governance
Before analyzing China’s approach toward platform responsibility, it is necessary and useful to understand the general principles of China’s online content governance. Such principles mainly include differentiating the regulation of traditional media and online media, requiring websites and online platforms to take “primary responsibility” for content governance, exerting stricter control over “news” content, and centering on public opinion management (or control).
4.2.1 Differentiating Traditional and Online Media
Compared to its regulation of traditional media, which are mostly state-owned or state-controlled (in terms of ownership), China has exerted looser control over digital media services, at least until a decade ago, when online content platforms emerged as the main gateways for online news and information. Chinese traditional media outlets (such as newspapers, radio, television, and magazines) are required to have a sponsor unit (主管主办单位) recognized by relevant media regulators, apart from meeting other preconditions.Footnote 5 Such sponsor units can be a department of the party or the government, a state-owned enterprise, a public institution, or an official media organization.Footnote 6 They retain “ultimate responsibility” over the content published by their affiliated media outlets and thus have a strong “incentive” to ensure they comply with government requirements.Footnote 7 In contrast, digital media services do not necessarily need such a sponsor unit, making it possible for private companies to enter the digital media market. Currently, all of China’s major social media platforms are owned by private internet companies.Footnote 8 As these platforms do not have a sponsor unit, they retain more discretion over content moderation than traditional media.
Both traditional and online media are required to obtain a license from relevant media regulators in China. Unlike regulatory agencies in liberal democracies, which “typically operate at arm’s length from the government,” China’s media regulators are government departments subject to the directives of the Chinese Communist Party (CCP)’s Publicity Department (PD).Footnote 9 China has different regulators for traditional and online media. The National Radio and Television Administration (NRTA) and the National Press and Publication Administration (NPPA) are the regulators for the radio/television sector and the print media, respectively. The Cyberspace Administration of China (CAC), which is also the office of the Central Cyberspace Affairs Commission (CCAC) led by President Xi Jinping, is China’s central and primary online content regulator, overseeing all online content (including online content produced by traditional media organizations).Footnote 10 While the regulatory responsibilities of the NRTA and NPPA mainly lie in reviewing and issuing licenses for traditional media outlets, they (as well as other government ministries such as the Ministry of Culture and Tourism) occasionally issue content guidelines that may also apply to online content and impose punishments on platforms.Footnote 11
4.2.2 IISPs: “Primary Responsibility” for Content Governance
Just like traditional media, Chinese internet information service providers (IISPs, including social media platforms) are required by government regulators to moderate the content they host. This requirement can be traced back to the Regulation on Internet Information Services (RIIS) of 2000 (China’s first of its kind), which requires all IISPs to monitor illegal content and immediately stop the distribution of such content when detected.Footnote 12 In recent years (since around 2014), as platforms have increasingly become the dominant gateways and gatekeepers for online information, Chinese authorities have repeatedly emphasized that websites and platforms must take “primary responsibility” for content governance. For example, during a symposium on Cyber Security and Informatization in 2016, President Xi noted, regarding the governance of online information, “Internet companies should take the primary responsibility and government regulators should strengthen their supervision, and they should establish a relationship of close cooperation and coordination.”Footnote 13 This “primary responsibility” requirement has been included in all relevant regulations concerning platform responsibility since 2017, illustrating that it has become a basic principle of China’s online content governance. Here, Chinese platforms’ “primary responsibility” mainly concerns user-generated content, as platforms generally do not moderate content from media organizations and government-run accounts (although such content is still subject to platforms’ keyword filtering systems and user complaint and reporting systems).Footnote 14
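To make the mechanism concrete, the following is a minimal sketch of what such a platform-side keyword filtering pass might look like. The term lists, matching logic, and actions are hypothetical placeholders for illustration, not any platform’s actual rules or blocklist:

```python
# Hypothetical sketch of a platform-side keyword filtering pass.
# The term lists, matching logic, and actions are illustrative
# placeholders, not any platform's actual rules or blocklist.

BLOCKED_TERMS = {"blocked-term-1", "blocked-term-2"}  # placeholder entries
REVIEW_TERMS = {"sensitive-term-1"}                   # held for human review

def screen_post(text: str) -> str:
    """Return a moderation action for a post: 'block', 'review', or 'allow'."""
    lowered = text.lower()
    if any(term in lowered for term in BLOCKED_TERMS):
        return "block"   # never published; may also trigger account penalties
    if any(term in lowered for term in REVIEW_TERMS):
        return "review"  # withheld until a human moderator decides
    return "allow"

print(screen_post("an ordinary post"))         # -> allow
print(screen_post("contains blocked-term-1"))  # -> block
```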
China’s approach regarding platform content responsibility has been classified by MacKinnon et al. as “strict liability,” compared to the models of “broad immunity” (e.g., in the US, where platforms enjoy “safe harbor” protection with respect to third-party content, with a few exceptions like sex-trafficking information) and “conditional liability” (e.g., in the EU and many countries, where platforms enjoy immunity if they take down illegal content upon actual knowledge of its existence).Footnote 15 It is worth noting that, just as in the EU and many other countries, Chinese platforms also enjoy “conditional liability” regarding tort damages. According to the Civil Code of the People’s Republic of China (which replaced China’s Tort Liability Law and several other civil laws in 2021), if online service providers take necessary measures in a timely manner upon receiving notices from the infringed users, they are not held liable for the damages.Footnote 16 Therefore, classifying China’s approach to platform responsibility as “strict liability” can be misleading. In other words, Chinese platforms may be “strictly liable” to government regulators for hosting some types of banned content (as defined later, such as politically sensitive content), but they bear only “conditional liability” for infringements of users’ rights and interests caused by content they host.
4.2.3 Stricter Control over “News” Content
Chinese media and internet regulators set out stricter rules over media services that produce or host news information. According to the Regulation on Internet News Information Services (RINIS), IISPs that produce, host, or reprint news content are required to obtain an Internet News Information Service (INIS) license from the CAC or its local offices.Footnote 17 Here, “news” content is defined in a narrow sense, referring to news reports and commentaries about public affairs (including politics, economy, military, and foreign affairs) and breaking social incidents.Footnote 18 Thus, sports or entertainment news is generally not subject to this limitation in China.
An INIS license specifies which type(s) of news services the license-holder can provide: news gathering and production, news reprinting, and communication platforms. Only news organizations and their controlled subsidiaries can apply for a license for news gathering and production, which means privately owned online platforms can only apply for a license to host news (as a communication platform) and/or “reprint” news from allowed news sources. At the time of writing, the most recent list of allowed internet news providers, released by the CAC in 2021, contains a total of 1,358 news sources at both the national and local levels, including traditional media outlets, news and government websites, and Weibo and WeChat public accounts run by media organizations and government departments (or party and public institutions).Footnote 19 While the list greatly expanded the scope of allowed news sources for reprinting compared with its 2016 version, Caixin, one of China’s best-known investigative journalism outlets, was removed from the list, which some overseas media organizations saw as a new signal of China’s increasingly tight media control.Footnote 20
The limitation on news gathering, production, and reprinting is an important rule that allows Chinese government regulators to control the sources of news. However, it can be difficult to define what is or is not “news” content in practice. For example, many individuals publish commentaries on social issues of public concern through their social media public accounts, a considerable percentage of which could be classified as news commentaries. In response, the CAC promulgated in 2021 the Regulation on Public Accounts Information Services of Internet Users, which stipulates that both public account operators and online platforms must obtain an INIS license to publish or host news content.Footnote 21 While this regulation demonstrates Chinese regulators’ concerns over social media being used by individuals to bypass government controls over news sources, and exerts more pressure on both platforms and individual authors, the difficulty for online platforms of judging what constitutes “news” remains, and so far no individual has been granted an INIS license. In practice, major platforms like WeChat and Weibo often (intentionally or unintentionally) allow, or temporarily allow, some content from individual authors that may challenge official narratives.Footnote 22
China’s control over news content is closely linked to its policy toward foreign social media platforms. This is because social media platforms usually host a huge amount of news content and thus must apply for an INIS license, at least in principle.Footnote 23 To apply for an INIS license, an entity must register in China, and its main executives and editor-in-chief must be Chinese citizens.Footnote 24 In addition, foreign capital is not allowed to establish an INIS, even in the form of a partnership with Chinese capital.Footnote 25 These limitations mean foreign social media platforms are de facto blocked in China. In October 2021, LinkedIn announced that it would shut down its social networking business (but would launch a jobs-only site) in China, citing a “challenging operating environment and greater compliance requirements.”Footnote 26 The closure of LinkedIn’s social networking business means that all major foreign platforms are now either blocked in, or have retreated from, China. As for why LinkedIn had been allowed to operate in China, one possible explanation is that the platform is generally viewed as a career-networking site rather than a typical social media platform that hosts news content. In addition to strict control over news-related platforms, China’s requirements for platforms regarding content moderation (discussed later) also make it very difficult, if not impossible, for foreign social media platforms to operate in the country.
4.2.4 Content Governance Rationale: Public Opinion Management
The rationale behind China’s stricter control over news content lies in Chinese authorities’ concerns over public opinion management since news content is deemed as having the power or potential to influence public opinion (as illustrated in the common Chinese phrase “news and public opinion work”). This is exactly why China imposes extra obligations on platforms “with characteristics of public opinion or capable of social mobilization,” which are required to conduct security assessments when launching new applications or new technologies.Footnote 27 China blocks foreign social media platforms for the same reason, given that global platforms like Twitter and Facebook have proven to be important tools facilitating communications during political turmoil or uprisings, especially seen in the so-called Arab Spring.Footnote 28
As the fundamental rationale for China’s online content governance, public opinion management, which comprises both public opinion guidance and public opinion supervision,Footnote 29 is embodied in all Chinese regulations concerning platform responsibility. For example, the above-mentioned RINIS states that INIS providers must “stick to correct guidance of public opinion, play the role of public opinion supervision, and facilitate a positive and healthy Internet culture.”Footnote 30 However, in these regulations, the purpose of public opinion management is often framed as promoting public interest, ensuring online safety, and maintaining social order in China.Footnote 31
4.3 The Chinese Approach toward Platform Responsibility
China adopts a patchy framework and an iterative approach in setting out platform responsibility for content moderation. Over the last decade, China has promulgated around two dozen separate regulations in this area. Some of these regulations concern all types of platforms, while others target a specific type of online platform (such as instant messaging, microblogging, and livestreaming platforms), a specific technology used by platforms (e.g., blockchain, algorithmic recommendation, and deepfake), a specific service provided by platforms (e.g., online comments, public accounts, and online groups), or a specific issue brought about by platforms (e.g., personal data protection and consumer/user protection).Footnote 32 Despite its patchy nature, China’s platform regulation has demonstrated considerable consistency in defining illegal and harmful content, platform obligations, and enforcement methods.
4.3.1 Defining Illegal and Harmful Content
Building on earlier laws and regulations, China’s Regulation on Governance of Online Information Ecology (RGOIE), which took effect on March 1, 2020, defines three types of online content: banned (or illegal) content, harmful content, and positive content. It requires content producers and online platforms not to produce or distribute “banned” content, to prevent and resist “harmful” content, and to actively produce or distribute “positive” content.Footnote 33 Content guidelines of Chinese platforms often combine these requirements with their own rules (such as rules concerning content monetization and user reports and complaints).
The RGOIE specifies a series of “banned” content,Footnote 34 which can be classified into three types. The first type is of a “political” nature, including content that (1) opposes the basic principles of China’s constitution (such as China’s socialist system and the leadership of the CCP); (2) jeopardizes national security or national unification, leaks national secrets, subverts state power, or damages national interests; (3) smears national heroes and martyrs; (4) propagates or incites terrorism and extremism; (5) sabotages China’s national unity or religious policies by inciting hatred and discrimination among different ethnicities or propagating cults and feudalistic superstition; or (6) spreads rumors and disturbs economic and social order. The second type of banned content includes pornography, gambling, and incitement of violence, murder, or other crimes. The third type covers defamation and other infringements of others’ legal rights and interests such as reputation and privacy,Footnote 35 as well as other content deemed illegal under laws and regulations (such as copyright-infringing content and selling illegal items). These types of banned content reflect the characteristics of China’s legal system. For example, spreading pornographic content through the internet and gambling (for profit rather than for recreation) are potentially criminal offenses in mainland China (Criminal Law, chapter six). China’s definition of banned content, especially content of a “political” nature, reflects its concern with public opinion management. For instance, adding “smearing national heroes and martyrs” to the list of banned content has been viewed by some as the CCP’s attempt to control narratives about China’s history, as heroes and martyrs often represent values endorsed by the authorities and relate to the interpretation of history. In recent years, several Chinese platforms have been censured or punished for hosting content deemed to be “smearing national heroes.”Footnote 36
In contrast, “harmful” content listed in the regulation is mainly concerned with social stability and morality. The listed types of harmful content include: (1) exaggerated or sensational titles; (2) sensationalizing scandals, gossip, and misdeeds; (3) insensitive comments on natural disasters or severe accidents; (4) sexual content; (5) violent and graphic content; (6) inciting discrimination among different groups and people from different places; (7) propagating vulgar and low-taste content; (8) enticing minors to imitate dangerous acts or to develop bad habits; and (9) other content harmful to the online content ecology.Footnote 37 The emphasis on social stability and moral goodness is an important feature of China’s online content governance,Footnote 38 and the government has legitimized its online content governance (sometimes outright censorship) by framing it as benevolence and protection.Footnote 39
In addition, the regulation (RGOIE) lists several types of “positive” content, including content that propagates major policies and strategies of the Party, highlights China’s economic and social development, effectively responds to public concerns and guides the public toward consensus, and displays a multi-faceted China. It encourages online platforms to display and present positive content prominently.Footnote 40 This aligns well with the Chinese party-state’s rationale for online content governance: public opinion management. The goal is to build an online ecology that propagates “positive energy,”Footnote 41 which may help distract the public’s attention from negative news and criticism of the government. As Bandurski observes, the term “positive energy” is “at the very heart of political discourse” in the Xi Jinping era, a tool for internet governance and control.Footnote 42
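The three-tier scheme described above maps naturally onto a simple data structure. The sketch below is a hypothetical encoding of it; the tier names, the “neutral” catch-all category, and the action strings are illustrative assumptions, not terms from the regulation itself:

```python
from enum import Enum

# Hypothetical encoding of the RGOIE's three-tier scheme. The "neutral"
# catch-all tier and the action strings are illustrative assumptions,
# not terms from the regulation itself.

class ContentTier(Enum):
    BANNED = "banned"      # must not be produced or distributed
    HARMFUL = "harmful"    # to be prevented and resisted
    POSITIVE = "positive"  # to be actively produced and distributed
    NEUTRAL = "neutral"    # everything else

ACTIONS = {
    ContentTier.BANNED: "remove; do not distribute",
    ContentTier.HARMFUL: "suppress distribution",
    ContentTier.POSITIVE: "display and present prominently",
    ContentTier.NEUTRAL: "distribute normally",
}

def handle(tier: ContentTier) -> str:
    """Map a content tier to the platform action the regulation implies."""
    return ACTIONS[tier]

print(handle(ContentTier.POSITIVE))  # -> display and present prominently
```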
4.3.2 Protection of Users and Privacy
While China’s platform governance has centered on public opinion management, government regulators have paid increasing attention to the protection of users (especially minors) and personal data in recent years, showing the adaptability of the party-state regarding regulation in the platform era. For example, the Regulation on Algorithmic Recommendation of Internet Information Services (RARIIS), which took effect in March 2022, includes a dedicated section on user protection. The regulation requires algorithmic recommendation service providers (ARSPs) to protect the rights and interests of different types of users, including minors, the elderly, consumers, and workers (who rely on platforms for orders). The RARIIS stipulates that ARSPs (including social media platforms) should not recommend content that may lead minors to imitate dangerous acts or develop bad habits such as addiction,Footnote 43 should strengthen the monitoring and moderation of online fraudulent information to protect the elderly,Footnote 44 and should ensure that their algorithms treat consumers and workers fairly.Footnote 45
Another example of user protection is found in the Chinese government’s policy document targeting livestreaming platforms, issued in February 2021.Footnote 46 A major aim of this guideline document is to protect minors on these platforms. It specifies that platforms should not allow users under sixteen to open a livestreaming host account and should seek the consent of the guardians of minors between sixteen and eighteen before allowing them to open an account.Footnote 47 In addition, livestreaming platforms should develop a “minor mode” for minor users and block content that is harmful to them (such as obscene and pornographic content). Another purpose of this policy document is to protect users from excessive and irrational consumption (such as paying high tips to livestreaming stars) – a prominent phenomenon on Chinese livestreaming platforms. The guideline document requires platforms to set a series of limits regarding tips, such as on the total amount of tips per day from a single user.Footnote 48 It also requires platforms not to allow minors to tip livestreaming hosts; when a tip is verified to have come from a minor using an adult account, platforms should refund the money.Footnote 49
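The age and tipping rules just described lend themselves to straightforward checks. The sketch below illustrates them under stated assumptions: the daily tip cap is a placeholder value, since the guideline requires limits without the exact figures being reproduced here:

```python
# Hypothetical sketch of the 2021 livestreaming guideline rules described
# above. The daily tip cap is an illustrative placeholder; the guideline
# requires such limits but the figure here is an assumption.

DAILY_TIP_CAP = 1_000  # placeholder amount in Yuan, not the real cap

def may_open_host_account(age: int, guardian_consent: bool) -> bool:
    """Under 16: never; 16-18: only with guardian consent; 18+: yes."""
    if age < 16:
        return False
    if age < 18:
        return guardian_consent
    return True

def tip_allowed(user_age: int, tipped_today: float, amount: float) -> bool:
    """Minors may not tip; adults are capped per day."""
    if user_age < 18:
        return False
    return tipped_today + amount <= DAILY_TIP_CAP

def refund_due(tip_amount: float, verified_minor_on_adult_account: bool) -> float:
    """Tips verified to come from a minor using an adult account are refunded."""
    return tip_amount if verified_minor_on_adult_account else 0.0

print(may_open_host_account(17, guardian_consent=True))  # -> True
print(tip_allowed(20, tipped_today=950, amount=100))     # -> False (over cap)
```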
As for the protection of personal data, China has issued relevant laws, regulations, and policy notices in recent years, including the Personal Information Protection Law (PIPL)Footnote 50 and the policy notice on Defining the Scope of Necessary Personal Information for Common Types of Mobile Apps.Footnote 51 Among them, China’s PIPL establishes a series of basic principles for personal information collection, including informed consent from users, minimum collection (only collecting necessary information), special care for “sensitive” personal data, and consent from parents or guardians of minors under the age of fourteen.Footnote 52 The above-mentioned policy noticeFootnote 53 defines the “necessary” personal information that various types of platforms may collect when providing basic services. For example, for instant messaging platforms, the necessary personal information includes users’ telephone numbers and the accounts of their contacts;Footnote 54 for social networking platforms, such information includes only users’ telephone numbers;Footnote 55 while for livestreaming platforms and short-video platforms, no personal information is necessary for providing basic functions.Footnote 56
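The notice’s per-app-type scopes can be read as a simple allowlist, against which any additional collection counts as excessive under the minimum-collection principle. The following sketch encodes the examples just described; the dictionary keys and field names are illustrative assumptions:

```python
# Hypothetical encoding of the 2021 policy notice's "necessary personal
# information" scope per app type, as described above. Key and field
# names are illustrative assumptions, not the notice's own wording.

NECESSARY_INFO = {
    "instant_messaging": {"user_phone_number", "contacts_accounts"},
    "social_networking": {"user_phone_number"},
    "livestreaming": set(),  # no personal info needed for basic functions
    "short_video": set(),
}

def collection_is_excessive(app_type: str, requested: set[str]) -> bool:
    """Flag requests beyond the defined necessary scope (minimum collection)."""
    return not requested <= NECESSARY_INFO[app_type]

print(collection_is_excessive("short_video", {"user_phone_number"}))  # -> True
```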
On November 25, 2022, the CAC, in conjunction with the Ministry of Industry and Information Technology (MIIT) and the Ministry of Public Security (MPS), issued a new regulation governing deep synthesis technology and services, commonly known as “deepfake” technology. This regulation, officially titled the Regulation on the Administration of Deep Synthesis of Internet Information Services (the Deep Synthesis Regulation),Footnote 57 came into effect on January 10, 2023. Deepfake technology, which involves the use of advanced machine learning and AI to create or alter visual and audio content, can be used to generate synthetic media in which a person’s likeness in an image or video is replaced with another’s. The Regulation is part of China’s effort to increase supervision over this technology. It stipulates that providers and deployers of this technology are not allowed to produce or spread illegal content and fake news information;Footnote 58 and deepfake service providers should label content produced by this technology in a way that does not affect end-users’ use of it, and should make the output of some deepfake services, such as face and voice synthesis and face swapping, immediately recognizable where it may mislead users.Footnote 59 These stipulations reflect the party-state’s concerns that this new technology may be used to influence public opinion and social stability, but they also concern the protection of users’ rights and interests.
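The two labeling duties just described, an unobtrusive label on all synthetic output and a conspicuous notice for potentially misleading services, might be sketched as follows. The service-type names and label strings are hypothetical, not terms from the regulation:

```python
# Hypothetical sketch of the Deep Synthesis Regulation's labeling duties
# described above. Service-type names and label strings are illustrative
# assumptions, not terms from the regulation's text.

CONSPICUOUS_LABEL_SERVICES = {"synthetic_face", "synthetic_voice", "face_swap"}

def labels_for(service_type: str, may_mislead: bool) -> list[str]:
    """Return the labels a provider would attach to one piece of output."""
    labels = ["embedded-provenance-tag"]  # unobtrusive label on all synthetic output
    if service_type in CONSPICUOUS_LABEL_SERVICES and may_mislead:
        labels.append("visible-synthetic-notice")  # immediately recognizable to viewers
    return labels

print(labels_for("face_swap", may_mislead=True))
# -> ['embedded-provenance-tag', 'visible-synthetic-notice']
```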
4.3.3 Major Obligations for Platforms
China’s regulations regarding platform responsibility for content moderation have laid down a series of obligations for platforms, which form the specifics of what the government calls platforms’ “primary responsibilities.” Major obligations include:
Establishing an editor-in-chief: This requirement is for all INIS providers,Footnote 60 including social media platforms. It means social media platforms, just like traditional media, must have an editor-in-chief who takes ultimate responsibility for content on their sites.
Real-name user registration: While internet users can use most services provided by platforms without registration, they must register with their real identities before they can post content (although they can still use nicknames for account names). Platforms are required to verify users’ identities through mobile phone numbers (which also require real-name registration in China), ID cards, and other methods.Footnote 61
Real-time content monitoring and moderation: Under the above-mentioned US and EU “broad immunity” and “conditional liability” models, platforms are not subject to “general monitoring or active fact-finding obligations.”Footnote 62 In contrast, Chinese platforms are required to conduct “real-time” monitoring and moderation of content.Footnote 63
Including links to the government-run user reporting website: All Chinese IIS providers are required to include links, in a prominent way, to the website of the Illegal and Harmful Information Reporting Center (12377.cn) run by the CAC.Footnote 64 Through the link, users can easily report various types of illegal and harmful content, such as politically sensitive information, rumors, fraudulent information, and pornographic content.
Keeping user records: All IISPs are required to keep user records for at least sixty days.Footnote 65 For some types of platforms, such as microblogging platforms, the required record-keeping period is six months.Footnote 66
Grading-and-classifying (分级分类) management mechanism: Platforms are required to establish this management mechanism for user accounts and user content. Here, “grading” means platforms should assess the credit of user accounts and provide services to them accordingly. If certain users are found to have posted illegal or harmful content on a platform, their credit should be downgraded by the platform, and the services available to them limited accordingly. For users who have seriously breached relevant laws or regulations, such as by posting rumors, platforms should add them to a “blacklist” and take corresponding measures (e.g., closing their accounts and preventing them from re-registering under another name).Footnote 67 “Classifying” means platforms should classify user accounts and content into different categories according to factors including the number of followers and the content area (such as politics, economy, and entertainment).Footnote 68 For user accounts producing content in areas such as politics, platforms are required to exercise stricter monitoring and moderation. According to the CAC, this management mechanism aims to achieve a “precise, focused, and dynamic” management of user accounts,Footnote 69 forcing platforms to concentrate their moderation resources on key content areas and influential user accounts.
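A minimal sketch of how a platform might operationalize this grading-and-classifying mechanism follows. The credit thresholds, score deductions, and follower cutoff are illustrative assumptions; the CAC rules require such a mechanism without prescribing specific numbers:

```python
from dataclasses import dataclass

# Hypothetical sketch of the grading-and-classifying mechanism. Credit
# thresholds, score deductions, and the follower cutoff are illustrative
# assumptions; the CAC rules do not prescribe specific numbers.

BLACKLIST_THRESHOLD = 0              # placeholder credit floor
STRICT_REVIEW_AREAS = {"politics"}   # key content areas get stricter review

@dataclass
class Account:
    name: str
    content_area: str
    followers: int
    credit: int = 100
    blacklisted: bool = False

def record_violation(acct: Account, severe: bool) -> None:
    """'Grading': downgrade credit for violations; blacklist serious offenders."""
    acct.credit -= 50 if severe else 10
    if severe or acct.credit <= BLACKLIST_THRESHOLD:
        acct.blacklisted = True  # close the account, prevent re-registration

def review_priority(acct: Account) -> str:
    """'Classifying': focus moderation on key areas and influential accounts."""
    if acct.content_area in STRICT_REVIEW_AREAS or acct.followers > 1_000_000:
        return "strict"
    return "standard"

acct = Account("demo", content_area="politics", followers=5_000)
print(review_priority(acct))         # -> strict
record_violation(acct, severe=True)
print(acct.blacklisted)              # -> True
```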
4.3.4 Enforcement Methods
China adopts a mixed approach to enforcing regulations regarding platform responsibility. While civil and criminal laws are applicable, administrative measures are China’s main method of pressuring platforms to fulfill their “primary responsibility” for online content governance.
China’s Civil Code contains articles on the liability of internet service providers (ISPs) for tort damages. According to the law, if an internet user commits a tort, the injured person is entitled to notify the ISP and ask it to take necessary measures (such as content deletion or blocking); the notice to the ISP should include initial evidence and the real identity of the injured party.Footnote 70 After receiving the notice, the ISP should forward it to the internet user concerned and, based on the initial evidence, take necessary measures in a timely manner; otherwise, it is jointly liable for the extended damage of the tort.Footnote 71 The notifying party is in turn liable to the internet user and the ISP for damage caused by a wrongful notice.Footnote 72 If an ISP is aware, or should be aware, that an internet user is infringing the civil rights and interests of others and fails to take necessary measures, it is jointly liable for the infringement.Footnote 73
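Reduced to its decision points, the conditional-liability rule just described can be sketched as a simple workflow. The field names and outcome strings below are illustrative assumptions, not statutory language:

```python
from dataclasses import dataclass

# Hypothetical sketch of the Civil Code's notice-and-takedown flow,
# reduced to its decision points. Field names and outcome strings are
# illustrative assumptions, not statutory language.

@dataclass
class Notice:
    has_initial_evidence: bool
    has_real_identity: bool

def process_notice(notice: Notice, isp_acts_timely: bool) -> str:
    """Outcome for the ISP under the conditional-liability rule."""
    if not (notice.has_initial_evidence and notice.has_real_identity):
        return "invalid notice; the sender may be liable for a wrongful notice"
    if isp_acts_timely:
        # The ISP forwards the notice to the user concerned and takes
        # necessary measures (e.g., deletion or blocking): no liability.
        return "no ISP liability"
    return "ISP jointly liable for the extended damage"

print(process_notice(Notice(True, True), isp_acts_timely=False))
# -> ISP jointly liable for the extended damage
```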
Criminal laws are also applicable to platforms for hosting illegal content, although prosecutions of platform or website executives under criminal law have so far been rare. Kuaibo, a once popular but now-defunct video-streaming platform in China, is a case in point. The platform, which allowed users to watch pirated videos through P2P technology, was found to host a huge number of pornographic videos. In 2016, Wang Xin, the CEO of Kuaibo, was sentenced to forty-two months in prison and fined 1 million Yuan (around $150,000); three other senior executives were also given prison terms.Footnote 74 This case attracted great public attention in China and signaled the resolve of government regulators to hold platforms liable for hosting illegal content like pornographic videos. According to a judicial interpretation document from China’s Supreme Court and Supreme Procuratorate, hosting over 200 illegal videos (in the case of Kuaibo, over 30,000 pornographic videos) is classified as “leading to the spread of a great deal of illegal information” and violates China’s Criminal Law.Footnote 75
In addition to civil and criminal laws, China mainly resorts to administrative measures to enforce platform responsibility. Such measures include government-initiated “internet-cleaning” campaigns, summoning platform executives, ordering platforms to suspend content updating or even close a service, ordering app stores to remove the apps concerned (usually temporarily), and issuing fines. For example, during various internet-cleaning campaigns, online platforms are required to self-check their sites thoroughly and deal with problematic user accounts and content. As a result of a campaign in November 2018 targeting the “chaotic situation” around public accounts run by individuals, 9,800 public accounts were suspended or closed across platforms.Footnote 76 Platform executives (mostly editors-in-chief) have been frequently summoned by government regulators, especially the CAC and its local offices, in recent years. During these summoning meetings, government regulators often point out the problems of the platforms concerned and require that they be redressed.Footnote 77 For more serious breaches, summoning sessions are sometimes accompanied by a fine. For example, from January to November 2021, Weibo was frequently summoned by government regulators and fined forty-four times, totaling 14.3 million Yuan (around $2.2 million).Footnote 78
It is worth noting that while Chinese regulators do use fines as a punitive measure to ensure platform compliance, the amounts are not comparable with those imposed by some Western regulators. For instance, the EU’s Digital Services Act (DSA) allows fines of up to 6 percent of a platform’s global revenue for serious breaches.Footnote 79 In contrast, Chinese regulators rarely issue heavy fines on platform companies, with the 18.2 billion Yuan fine on Alibaba for antitrust reasons in 2021Footnote 80 and the 8.026 billion Yuan fine on Didi for violations of network security and data security laws in 2022 being two exceptions.Footnote 81 For violations of content-related regulations in China, the maximum fine is 500,000 Yuan.Footnote 82 However, unlike their Western peers, Chinese platforms seldom appeal against the decisions of government regulators, despite having the right to do so under China’s Administrative Procedure Law (amended in 2017). This demonstrates China’s authoritarian nature and the asymmetrical power relations between the Chinese government and online platforms, given that the government can simply revoke the license of a platform under extreme circumstances.
4.4 A Brief Case Study on TikTok
This section conducts a brief case study on the video-sharing platform TikTok and its sister platform Douyin (available only in China), owned by the Beijing-based ByteDance, to illustrate how China’s relevant laws and regulations influence the content moderation practices of TikTok in overseas markets. Meanwhile, by examining how TikTok adjusts its content moderation policies in overseas markets, this study also sheds light on how other nations’ laws and regulations shape the practices of a global platform with a Chinese origin.
TikTok and Douyin are two similar but separate apps. Douyin, launched in 2016, is China’s most popular short-video platform, with over 600 million daily active users as of August 2020;Footnote 83 TikTok, launched in 2017 in global markets, is the world’s leading destination for short videos, boasting over 1 billion monthly active users as of September 2021.Footnote 84 The two platforms share the same logo and most user interface features (though Douyin has more advanced e-commerce features at the time of writing). They also have similar recommendation algorithms that are mainly based on users’ interests. However, ByteDance has established a “wall” between the two platforms: Chinese mainland users can only download Douyin, and overseas users can only access TikTok. Thus, the two platforms have different users and user content, and their user data are stored in different locations: Douyin’s data is stored in mainland China; TikTok’s data is stored in Singapore and the United States, and soon also in Ireland and Norway.Footnote 85 While the two platforms share many features, their content moderation practices and moderation algorithms are quite different, given that each must comply with the laws and regulations of its respective jurisdiction.Footnote 86 Douyin must comply with China’s content governance requirements, including carrying out real-time content monitoring (through keyword filtering and other means) and censorship of certain content, while TikTok is subject to the laws and regulations of the countries and regions where it operates.
In overseas markets, TikTok’s content moderation policies and practices have been criticized for various reasons. In liberal democracies, TikTok has been accused of censoring content that may displease Beijing. In 2019, The Guardian reported that TikTok’s leaked internal documents instructed its moderators to censor videos that mention Tiananmen Square (i.e., the 1989 political turmoil in Beijing) and Tibetan independence.Footnote 87 The platform has also been criticized for censoring content from black creators and other marginalized groups,Footnote 88 and its internal content guidelines were revealed to require moderators to suppress posts by users deemed too “ugly” or “poor” for the platform.Footnote 89 In response, TikTok claimed that the leaked moderation guidelines were either outdated or never put into use,Footnote 90 and that other alleged censorship incidents, such as the blocking of content from some black users, were due to mistakes of automated moderation. Apart from censorship criticism, the platform has also been temporarily banned, or threatened with bans, for hosting “immoral” or “obscene” content in more conservative jurisdictions such as Pakistan, Bangladesh, and Indonesia.Footnote 91 In some authoritarian countries like Vietnam, the platform has been blamed for spreading anti-government content and for not doing enough (as Douyin does in China) to deal with addiction, thus posing a threat to the country’s youth population.Footnote 92
The above criticism of TikTok’s content moderation illustrates how China’s laws and regulations may have influenced the platform’s overseas practices. Keywords like “Tiananmen Square [turmoil]” and “Tibetan independence” are taboos on the Chinese internet, indicating that TikTok, at least in its initial stage, may have adopted a keyword filtering system of the kind required by China’s regulations. Even TikTok’s bizarre rule regarding users deemed “too ugly” can be traced back to its Chinese origin, as Douyin was once famous in China for its posh content creators – a strategy to attract users when the platform was newly launched. Such an explicitly discriminatory content policy was not a big issue in China, partly because government regulators have focused on content concerning public opinion management (as discussed earlier). As a result, Chinese platforms usually pay much more attention to politically sensitive content (i.e., content that may damage the image of the CCP and government) than to other types of problematic content, such as discriminatory, obscene, and vulgar content.
However, TikTok’s content moderation policies and practices have been shaped to a much greater extent by the laws, regulations, and norms of overseas markets. TikTok’s Community Guidelines include sections on Youth Safety and Well-Being, Safety and Civility, Mental and Behavioral Health, Privacy and Security, and Sensitive and Mature Themes, among others.Footnote 93 While the guidelines address some concerns particularly relevant to TikTok, such as the Mental and Behavioral Health section, which focuses on issues like suicide and self-harm, disordered eating and body image, and dangerous activities and challenges, they are not that different from the Community Guidelines of other global platforms like YouTubeFootnote 94 and Facebook. Also, like its Western peers, TikTok has published quarterly transparency reports since January 2021, including its Community Guidelines enforcement reports.Footnote 95 In contrast, the Community Guidelines of TikTok are totally different from those of Douyin. The User Service Agreement of Douyin lists over twenty types of banned or harmful content,Footnote 96 most of which are identical or similar to those listed by the government in the above-discussed RGOIE. TikTok’s Community Guidelines demonstrate its efforts to present itself as a global rather than a Chinese platform, following “international legal frameworks” and “industry best practices” (shaped by major Western platforms like YouTube and Facebook).Footnote 97
In addition, when implementing its Community Guidelines, TikTok has adopted a “localization” strategy. In March 2020, ByteDance dismantled its entire Beijing team responsible for overseas content moderation and assigned the task to overseas teams.Footnote 98 The company stated that its regional and country teams “localize” content moderation in accordance with “local laws and norms.”Footnote 99 Moreover, ByteDance appointed Singapore’s Shouzi Chew as TikTok’s CEO and Vanessa Pappas (former head of TikTok US) as the platform’s COO in 2021 (Pappas stepped down from the role in June 2023).Footnote 100 While TikTok’s localized moderation strategy may help to alleviate criticism of its content moderation to some extent, it has also caused some controversy within China. On China’s online Q&A platform Zhihu (China’s equivalent of Quora), there are discussions about whether TikTok is still a Chinese company, since the platform hosts a large amount of “anti-China content” and its Chinese owner does not have (or has voluntarily given up) control over its content moderation policies.Footnote 101
4.5 Conclusion
This chapter examined China’s general principles for content governance and its approach to platform responsibility, both of which have distinct Chinese characteristics. As regards general principles, it is important to understand China’s fundamental rationale for online content governance: public opinion management. Following this rationale, China exerts stricter control over all online information services (including social media platforms) that host “news” content, which de facto blocks foreign social media platforms from operating in China. The Chinese characteristics of platform responsibility are embodied in its definition of illegal and harmful content, its heavy platform obligations, and its administrative measures for enforcement, all of which reflect China’s authoritarian nature and the asymmetrical power relations between the government and private platforms. China’s requirements for platforms to proactively monitor, moderate, and sometimes censor content, especially politically sensitive content, make it almost impossible for foreign social media platforms to survive without full compliance. Despite its authoritarian nature, China has also demonstrated adaptability in developing its constantly evolving platform regulation framework. This is especially evident in its many regulations regarding user and privacy protection.
The case study on TikTok (and Douyin) shows that a global platform’s content moderation policies are shaped both by its country of origin and by the regions and countries where it operates. On the one hand, the laws and regulations of the country of origin may have an impact on users from other jurisdictions through a global platform’s content moderation practices. In the case of TikTok, China’s laws and regulations may have influenced its content moderation in global markets. This was certainly the case before March 2020, when the company dissolved its Beijing-based moderation team. On the other hand, the content moderation of a global platform is also influenced by local laws and norms. This is clearly illustrated in the case of TikTok, which has regional and country-specific teams tasked with carrying out content moderation.