TikTok Challenge: Ticking Danger

Above: A youth in Tumkur, Karnataka, succumbed to his injuries while doing a back flip for a video

Though the Chinese app TikTok was banned by the Madras High Court and removed from the Google Play Store, it would be in the fitness of things for the government to issue guidelines for clarity

By Na Vijayashankar 

TikTok is a popular short-video sharing social media/mobile app which has been in the news recently. It was banned by the Madras High Court and removed from the Google Play Store. The Supreme Court has refused to interfere with the High Court's decision, but the controversy has not died down. It throws up several questions of law and propriety: should the Court intervene and get the app removed?

TikTok enables the sharing of 15-second videos and, like Twitter, has become immensely popular. Owned by a Chinese company, it is said to have over 1.2 billion downloads. It is especially popular in smaller towns and cities in India and has become known for objectionable content that is harmful to children. Initially, objections were raised in the Madras High Court because the app was being used to upload pornographic material.

However, other problems too have surfaced. There have been many deaths linked to the use of TikTok, caused by adventurous users trying to create sensational videos.

In one case, a youth in Tumkur, Karnataka, broke his neck while doing a back flip and died. In Delhi, one person reportedly shot his friend while making a video for TikTok. In a third incident, three youths riding a bike while trying to make a TikTok video met with an accident, and one of them died. There was also the case of a 12-year-old boy who hanged himself while attempting a “TikTok challenge”.

Most, if not all, of these incidents can be ascribed to unintentional accidents, but the fact is that they were prompted by the desire to make a TikTok video and become a celebrity overnight. This is a matter of concern. The craze for getting “likes” on TikTok videos, just as on Facebook, drives people to take such risks.

There have also been complaints that the app shares user data with Chinese authorities. But this allegation is not substantiated, and it applies equally to all Chinese-originated apps, computers and embedded software; it is not specific to TikTok. The owners of TikTok have complained that they are being unfairly targeted and that the problem lies elsewhere. They feel that short videos can promote creative use of the video format of communication and be a fun pastime for users. Lawmakers, however, consider TikTok an intermediary and feel it should be held responsible, to some extent, for the damage it causes to society.

While TikTok, like other social media platforms, is capable of being abused, whether the law should be invoked to ban it is debatable. One school of thought is that the issue is one of “freedom”, both for users who would like to showcase their talents and for the company to carry on a business of its choice. If TikTok is banned, as the Madras High Court has directed, the ban may be considered discriminatory, because there are other services on the internet which may have an equally harmful effect on society, more so on children and women.

The Supreme Court, which refused to transfer the case pending in Madras and adjudicate on the desirability of banning TikTok or imposing regulatory obligations, may be more comfortable passing judgment on a regulation framed by the government than suggesting a solution itself.

The government, on the other hand, may not be able to escape its responsibility to bring in some form of regulation aimed at reducing the possible ill-effects of TikTok, and to then let the Supreme Court find flaws in that regulation in the inevitable PIL that would follow.

Without a “proposed guideline”, a solution to the problem will not emerge. For this reason, the government needs to come up with an action plan to address the TikTok controversy. Perhaps as a first step, it may set up an “expert committee” to study the problem and advise it.

In this context, there are multiple issues that the government needs to consider. These include:

  • Mandating that the owners of TikTok exercise a level of due diligence that ensures abuse is detected and eliminated within a short time. This could mean moderating posts through a combination of artificial intelligence and manual filtering, along with quick removal of objectionable content based on post-publication feedback.
  • The government may mandate the platform owners to attach an “objectionable/inappropriate content” tag to every video so that the public can red-flag it. Once a certain minimum number of such objections is received, the video may be removed after manual verification, and repeated posting of such content should lead to blacklisting of the offender (a rough sketch of such a flag-and-review workflow follows this list).
  • The government may penalise TikTok if it is found that it is encouraging some of the abusive behaviour through promotional efforts.
  • The government may also examine and bring about a larger regulation that addresses the impact of internet-based services (of which TikTok is one) on the social behaviour of vulnerable sections of society, which may include children as well as adults who are irrational in their behavioural responses.
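
For illustration, the flag-and-review mechanism described above could work roughly along the following lines. This is a minimal Python sketch; the thresholds, class names and the assumption that every review upholds the complaint are hypothetical, and do not describe any existing TikTok feature or proposed regulation.

# Hypothetical sketch of the flag-and-review workflow outlined above.
from dataclasses import dataclass, field

REPORT_THRESHOLD = 5   # assumed minimum number of public red-flags before manual review
STRIKE_LIMIT = 3       # assumed number of confirmed violations before blacklisting

@dataclass
class Video:
    video_id: str
    uploader_id: str
    reports: int = 0
    removed: bool = False

@dataclass
class ModerationQueue:
    strikes: dict = field(default_factory=dict)    # uploader_id -> confirmed violations
    blacklist: set = field(default_factory=set)    # uploaders barred from posting

    def report(self, video: Video) -> None:
        """Record one public 'objectionable/inappropriate content' flag on a video."""
        video.reports += 1
        if video.reports >= REPORT_THRESHOLD and not video.removed:
            self.manual_review(video)

    def manual_review(self, video: Video) -> None:
        """Stand-in for human verification; assumed here to uphold the complaint."""
        confirmed = True  # a real system would route this to a human moderator
        if confirmed:
            video.removed = True
            self.strikes[video.uploader_id] = self.strikes.get(video.uploader_id, 0) + 1
            if self.strikes[video.uploader_id] >= STRIKE_LIMIT:
                self.blacklist.add(video.uploader_id)  # repeated offenders are blacklisted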

Mandating responsibility for filtering content is well within the current interpretation of “due diligence” under Section 79 of the Information Technology Act, 2000 (ITA 2000), and does not require any change in the law. A simple guideline on the lines of what was earlier issued to cyber cafes, matrimonial websites and the like would suffice to provide clarity. It can be applied not only to TikTok but to other gaming apps as well.

A “controller for game applications on the internet/mobiles” can be designated as a specific regulator within the Ministry of Electronics and Information Technology to focus on the issue. Under the proposed Personal Data Protection Bill, 2018, some of these game administrators could be treated as “guardian data fiduciaries” and would automatically be required to register themselves and be governed by the Data Protection Authority.

While discussing the amendments to the intermediary guidelines proposed by the government in December 2018, which were stalled by the opposition, the author had suggested a “self-regulatory framework” titled the “Intermediary Dispute Management System”. It can be considered for tackling this issue, where it is necessary to balance freedom of creative expression against the perceived risk of adverse impact on sections of society.

From the legal standpoint, the Supreme Court judgment on Section 66A (the Shreya Singhal case) would pose a hurdle, as it held that a certain level of freedom of expression has to be tolerated on social media platforms, despite allegations of defamation, threats, annoyance and so on.

A larger issue, however, is that this problem, along with dangerously addictive and harmful games such as Blue Whale and PUBG, is a social problem more than a legal one. The internet is a medium of communication, and many games and apps will continue to be created and published for different reasons. Some of these will be created by deviant minds who take pleasure in seeing others harmed.

At the same time, there will be some people in society who get attracted and addicted to harmful habits, just as people take to smoking and drinking. The law can only discourage malicious inducers of a crime by punishing them. But the misuse of a platform such as TikTok, where some users post violent and pornographic content, cannot be effectively prevented by law alone. It is the responsibility of society, and more particularly of schools, teachers and parents, to advise children not to become victims of harmful addictions on the mobile.

As regards the adventurism of adults that may cause damage, education can help to a certain extent and punishment to a somewhat greater extent, but the problem will have to be lived with. Early detection of such symptoms should lead to counselling sessions with appropriate experts. This could reduce the adverse impact of not only TikTok but also other similar threats that loom over the internet.

It would be worth watching whether the Supreme Court, the Madras High Court or a committee of experts comes up with suggestions in this regard.

The government can also encourage NGOs dedicated to such causes by offering incentives so that they take effective action to reduce the dangers posed by TikTok and other such apps.

—The writer is a cyber law and techno-legal information security consultant based in Bengaluru