[Update: The Madras High Court lifted its interim ban against TikTok on April 25th, 2019.]
TikTok, a short-form video streaming app, has been banned by an interim court order in India over concerns about the exposure of youth to sexual predators, pornographic content, and cyberbullying.
TikTok was also recently fined by the US Federal Trade Commission (“FTC”) for breaching the Children's Online Privacy Protection Act (“COPPA”) by collecting data from users under the age of 13 without parental consent.
TikTok now sits at the center of the broader discussion surrounding online youth protection.
WORLDWIDE POPULARITY – TIKTOK AND YOUTH ENGAGEMENT
Created by Chinese tech giant Bytedance Technology Co (“Bytedance”), TikTok is an app that allows users to create and share 15-second videos with quirky and musical special effects. The new generation of youth has made TikTok one of the most downloaded apps of the year, topping 100 million downloads on the Google Play Store. Unfamiliar to most streamers over the age of 24, this app puts a vintage filter on millennials and their Instagram accounts. 40% of TikTok users are 24 years old or younger, and some are under the age of 13.[1] Youth all around the world browse, share, and comment on funny video feeds, including in India, a country of 1.3 billion inhabitants and over 40 million active TikTok users.
INTERIM ORDER BAN – CHENNAI, INDIA
On April 3rd, 2019, the Madras High Court of India (the “Court”) issued an interim order directing the central government of India to ban the TikTok app and prohibit its download. The order also required local media to stop broadcasting content from TikTok. This interim order was issued in the context of an ongoing public interest litigation (“PIL”) alleging that the app’s content “degraded culture and encouraged pornography.” In compliance with this ruling, TikTok was removed from the Google Play Store and Apple’s App Store in India.
The Court found it evident that pornography and inappropriate content were available on the app, which could encourage the sharing of “child pornography” and spoil both the mindset and the future of young users. The Court also voiced concerns about “the possibility for children to contact strangers directly.”[2]
In India, there is no specific legislation protecting youth interests online, unlike COPPA in the US. In its ruling on the TikTok ban, the Court reproached the Indian Government for its failure to adopt an equivalent law to protect youth in India, who are especially vulnerable to predators and hoaxes. As a reminder, the Court referred to the disturbing “blue whale” online suicide challenge. Though this challenge turned out to be a hoax, the exposure of youth to such dangers remains very real. Likewise, back in October 2018, a case of cyberbullying on TikTok led to the tragic suicide of a 24-year-old in India.
TIKTOK CHALLENGES THE BAN
As an immediate response, a TikTok spokesperson told the press that the company would await an official order from the court, following which it would take appropriate action to abide by the local laws of India:
“At TikTok, we are committed to abiding by local laws and regulations. We fully comply with the Information Technology (Intermediaries Guidelines) Rules, 2011. We are currently awaiting the official order by the Honourable High Court of Madras and once received, we will review and take appropriate action regarding this matter. Maintaining a safe and positive in-app environment at TikTok is our priority. We have robust measures to protect users against misuse, protect their privacy and digital wellbeing.”
However, before abiding by local laws, TikTok first took steps to challenge and reverse the interim ban. On April 9th, 2019, the Supreme Court of India, the final court of appeal, agreed to hear a plea filed by Bytedance contesting the Madras High Court order that directed the ban of the TikTok app. The plea was heard on April 15th, 2019; the Supreme Court declined to quash the order, opting instead to remit the matter back to the Madras High Court. Although Bytedance’s court filing is not publicly available, it was reported that the company raised arguments regarding freedom of speech. Furthermore, according to a court filing seen by Reuters, the company argued that the interim ban resulted in a daily loss of $500,000 in revenue as well as 1 million new users. The company also alleged that the ban would prejudice its reputation and goodwill.
New development: On April 25th, 2019, the Madras High Court lifted its interim ban on the condition that TikTok not allow obscene or pornographic videos to be uploaded to its platform. Since the PIL is still ongoing, the app could be banned again if the final verdict so rules.
It is worth noting that this is not the first TikTok ban that Bytedance has contested (or successfully contested). Back in July 2018, TikTok saw its ban overturned by the Indonesian Government after agreeing (i) to clear all “negative content” from its platform and (ii) to increase security measures for users aged between 14 and 18.
ONLINE YOUTH PROTECTION – FILLING THE VOID
“If we can agree on anything, it should be that children deserve strong and effective protections online.” – Senator Ed Markey, 2019.
Though far from Canada, the interim ban of TikTok in India is a reminder of the regulatory void surrounding online youth protection, an issue close to home. In Canada, the Personal Information Protection and Electronic Documents Act (“PIPEDA”) does not make any distinction between minors and adults. Nonetheless, the Office of the Privacy Commissioner of Canada (“OPC”) has issued some guidance regarding this matter:
“The Office of the Privacy Commissioner of Canada (OPC) has consistently viewed personal information relating to youth and children as being particularly sensitive and must be handled accordingly. We have also taken the position that in all but exceptional cases, consent for the collection, use and disclosure of the personal information of children under the age of 13 must be obtained from their parents or guardians.”
In the US, in addition to the requirements and protections offered by COPPA, lawmakers introduced a new bill in March 2019 to protect children’s data privacy.
When it comes to protecting young minds from harmful online content, state regulators are lagging behind the fast-paced emergence of media apps, each more creatively engaging than the last. With every novel interactive app, new threats seem to appear for young users. Just last February, the “Momo Challenge” prompted a 12-year-old girl with autism to light the kitchen gas stove in the middle of the night. “Momo” is a disturbing animated character that reached out to young users primarily via WhatsApp, enticing them to participate in a game in which they would have to take their own lives and record it to share on social media. Parents can only do so much to limit children's exposure to such content, especially as classrooms in developed countries go through a micro-technological revolution.
For every country, the difficulty of legislating towards better online child protection lies in finding the appropriate security safeguards and then ensuring their effective implementation. Minimum-age verification based on users' self-declaration (and self-administered email authentication) can easily be bypassed by all-too-eager minors, with all the credit going to the '90s kids. Robust legal protections remain fundamental nonetheless: efforts and expenditure will have to be invested not just in drafting a flexible, intelligible, and time-resistant law, but also in developing novel tools for its practical implementation from the get-go. Who knows, perhaps effectiveness lies in the creation of a new independent regulatory body (as the UK is proposing), a state-funded online enforcement mechanism, or cooperatively developed PhotoDNA and AI tools to better filter harmful content (as WhatsApp is doing). Legislators will have to tap into their creative brains to stay relevant and innovate beyond the traditional remedies of the courts, which remain steady but slow. When it comes to youth protection, it is in the best interest of children for countries to stay ahead of the game, even with legislative baby steps.
[1]https://www.dw.com/en/tiktok-worlds-most-successful-video-app-faces-security-concerns/a-48063869
[2]https://in.reuters.com/article/tiktok-india-court/rpt-indian-state-court-asks-government-to-ban-inappropriate-video-app-tiktok-idINL3N21M15V