
Even with the court’s backing, a Right to be Forgotten law will be almost impossible to implement

Memories on the net are written in indelible ink. However much you want to forget and move on, even with laws permitting it, data is always available in some corner on the Internet, waiting to do damage.

By Sujit Bhar

To forget is an intellectual possibility as much as it is a legal improbability. Even restricting the debate to the cyber world, it is highly improbable that a person who wants every reference to himself or herself, in pictures, writings or documents, removed from the internet could ever achieve that.

The Orissa High Court, in a recent judgment, suggested that there should be a provision for the right to have objectionable content, photos or videos permanently removed from the internet. This means that if a wrong picture or video has been posted on the Internet, the victim should have the right to demand that it be removed forever.

This would be the elusive Right to be Forgotten.

Why elusive? The Right to be Forgotten is possibly as contentious as, if not more contentious than, the controversial right to euthanasia. The right to euthanasia is unlikely to ever become law in India, resting mostly on case-by-case judgments by the courts – as in the case of nurse Aruna Ramchandra Shanbaug, where the Supreme Court permitted passive euthanasia in principle; she died on May 18, 2015. If the Right to be Forgotten is a human right too, then finding a way to implement it is a Himalayan task.

The court observed that there is no law in the country governing this right and suggested that it could be read into Article 21 of the Constitution. It noted that many European countries have given this right to their citizens, and the Orissa High Court is the first constitutional court in India to recognise that Indians need it too.

Before venturing beyond the country's borders, one should look at the intended legislation in India. The Personal Data Protection Bill, 2018 has a section on the Right to be Forgotten. However, the Bill itself is in limbo. It was first tabled in Parliament by the Ministry of Electronics and Information Technology (law minister Ravi Shankar Prasad also heads this ministry) on December 11, 2019.

It was then referred to a Joint Parliamentary Committee (JPC) for detailed analysis, in consultation with experts and stakeholders. The committee, headed by BJP MP Meenakshi Lekhi, was tasked with finalising it on a short deadline. The Covid pandemic probably intervened, but the Bill is still pending.

As per Section 27 of the Bill, titled "Right to be Forgotten":

(1) The data principal shall have the right to restrict or prevent continuing disclosure of personal data by a data fiduciary related to the data principal where such disclosure:

(a) has served the purpose for which it was made or is no longer necessary;

(b) was made on the basis of consent under section 12 and such consent has since been withdrawn; or

(c) was made contrary to the provisions of this Act or any other law made by Parliament or any State Legislature.

Following the caveats in sub-section (2), sub-section (3) lays out the methodology as follows:

(3) In determining whether the condition in sub-section (2) is satisfied, the Adjudicating Officer shall have regard to—

(a) the sensitivity of the personal data;

(b) the scale of disclosure and the degree of accessibility sought to be restricted or prevented;

(c) the role of the data principal in public life;

(d) the relevance of the personal data to the public; and

(e) the nature of the disclosure and of the activities of the data fiduciary, particularly whether the data fiduciary systematically facilitates access to personal data and whether the activities would be significantly impeded if disclosures of the relevant nature were to be restricted or prevented.

If this is not the Licence Raj, there is a distinct possibility of it becoming an Adjudicating Officer Raj. Technically, your personal right isn't yours to decide, it seems.

One hopes these rough edges are sandpapered while deciding on the final format of the Bill before it is again put before Parliament.

That, though, is the part where legal apathy and the government's eagerness to maintain oversight of your personal affairs predominate. That is a political fight, and it should be fought tooth and nail.

There are, however, larger improbabilities hindering the progress of this idea: the reign of international laws, business interests and technology. It also touches on India's push for data localisation and its possible benefits.

The broader canvas

Yes, the EU has a good right to be forgotten law. But the grass on that side isn't as green as it seems from here. That law, as a recent court ruling pointed out, has its limitations. According to a BBC report of September 24, 2019, the EU's top court ruled that Google does not have to apply the right to be forgotten globally. What does that mean? It means that Google only needs to remove links from its search results in Europe following a valid request; it has no obligation to remove those links anywhere else. The ruling came in a case between Google and France's privacy regulator, CNIL.

It started in 2015, when CNIL ordered Google to remove search result listings that linked to pages containing damaging or false information about a person, and demanded that this be done globally.

In response, Google devised a technical solution: a geoblocking feature that prevented European users from seeing delisted links. Users elsewhere, however, had no problem accessing those links.
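How might such geoblocking work in practice? The BBC report does not describe Google's internal systems, so the following is a purely hypothetical Python sketch; the country list, the delisted_urls set and the filter_results function are illustrative stand-ins, not anything Google has published.

```python
# A purely hypothetical sketch of geoblocking delisted search results.
# Google's real implementation is not public; all names here are illustrative.
EU_COUNTRIES = {"FR", "DE", "IT", "ES", "PL", "NL", "BE", "SE", "IE"}  # abridged
delisted_urls = {"https://example.com/damaging-page"}  # links ordered removed

def filter_results(results, requester_country):
    """Hide delisted links only for requests originating inside the EU."""
    if requester_country in EU_COUNTRIES:
        return [url for url in results if url not in delisted_urls]
    return results  # users elsewhere still see everything

results = ["https://example.com/damaging-page", "https://example.org/other"]
print(filter_results(results, "FR"))  # delisted link hidden in Europe
print(filter_results(results, "US"))  # same query, full results elsewhere
```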

This geographically limited delisting falls short of the right to be forgotten rule in the General Data Protection Regulation (GDPR), also known as the "right to erasure", which gives an EU citizen the right and the power to demand that data about him or her be deleted.

Here, too, there seems to be a Controller Raj in place. This is evident in the guidelines incorporated in the GDPR: anybody can ask any organisation to remove data related to him or her; the request can be verbal or in writing; and the organisation has one month to respond.

This is where the complications begin.

In the case in question, Google argued that the obligation could be “abused by authoritarian governments trying to cover up human rights abuses” if it were applied universally, beyond European borders.

What Google explained was this: “Since 2014, we’ve worked hard to implement the right to be forgotten in Europe, and to strike a sensible balance between people’s rights of access to information and privacy.” This was the company’s statement, issued after the ECJ ruling. Then, quite like a snide aside, it added: “It’s good to see that the court agreed with our arguments.”

What all this means is that a link to, say, an objectionable photograph, outlawed and obfuscated by Google in Europe, could be accessed across the Atlantic and immediately relayed back to websites in Europe, making the original ruling of any court a joke.

In its decision not to extend delisting outside Europe, Google had on its side other tech giants such as Microsoft, as well as Wikipedia's owner the Wikimedia Foundation, the non-profit Reporters Committee for Freedom of the Press and the UK freedom of expression campaign group Article 19.

Even ECJ adviser Maciej Szpunar had advised that this right (the Right to be Forgotten) should be limited to Europe.

There are benefits to this ruling and to Google's decision. There are negatives too, but they are being overruled by the demand for transparency of information. The problem is that this strikes at the root of privacy laws around the world. When a non-profit reporters' committee joins hands with tech giants, there is reason to introspect. How would an attempt to create an unbiased piece of legislation based on the Orissa High Court's recommendation look on the world stage, or even across the country? Would such a move enable a terror organisation to forcefully remove all data related to its activities from the Internet?

The accepted methodology for tracing a terror group or drug cartel is scrutinising money trails. A Right to be Forgotten, if universally applied, would stymie this effort.

On the other hand, illegal references on the net to a rape victim, her pictures and identity, should be wiped clean from all databases. So decisions will have to be made case by case, which puts the universality of the law in doubt. It would become a Controller Raj, open to the corrupt practices that such discretion has so often invited in India.

The other option

That leaves us with the other option: end-to-end encryption, the technology that apps such as WhatsApp employ.

What is end-to-end encryption? An article in CNBC explains this well:

“End-to-end encryption is a security tool used by some apps and services — including WhatsApp, Signal and Facebook Messenger — to provide a greater level of privacy. Messages sent using this tool are encrypted before they leave the sender’s phone or computer, with a key unique to the devices at either end of an exchange. Even if they are intercepted during transmission by a hacker or a government agency, the messages are unreadable, since the only devices able to decode them are those belonging to the sender and the intended recipient.”
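To make the idea concrete, here is a minimal sketch of public-key end-to-end encryption using the PyNaCl library in Python. It is illustrative only: real messengers such as WhatsApp and Signal use the far more elaborate Signal protocol, and the message and variable names here are invented for the example.

```python
# A minimal sketch of end-to-end encryption using the PyNaCl library.
# Illustrative only; real messengers use the more elaborate Signal protocol
# (ratcheting keys, forward secrecy, etc.).
from nacl.public import PrivateKey, Box

# Each participant generates a key pair on their own device.
sender_key = PrivateKey.generate()
receiver_key = PrivateKey.generate()

# The sender encrypts with their private key and the receiver's public key.
sender_box = Box(sender_key, receiver_key.public_key)
ciphertext = sender_box.encrypt(b"Meet me at 6 pm")

# Anything in transit (a relay server, a hacker, an agency) sees only this:
print(ciphertext.hex())  # unreadable without the right keys

# Only the receiver, holding the matching private key, can decrypt.
receiver_box = Box(receiver_key, sender_key.public_key)
print(receiver_box.decrypt(ciphertext))  # b'Meet me at 6 pm'
```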

Why is this an option?

Hacking messages that are encrypted end-to-end is very difficult. The only way such a message can come out into the open is through the active participation of one of the participants. That leaves little scope for unauthorised grabbing of data and its release into the open.

What is encryption? It is an ancient technique, widely used and developed into an art form during the World Wars, in which data, or a message, is turned into an undecipherable format. Only a ‘key’ at the other end can turn it back into readable form. Encryption has progressed well beyond the simple ciphers of old and is now generally computer generated, leaving a third party with little to work with.
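As a minimal illustration of the modern, computer-generated 'key', here is a sketch using the Fernet recipe from the Python cryptography library; the sample data is, of course, invented.

```python
# A minimal sketch of key-based encryption using the Python "cryptography"
# library's Fernet recipe. Without the key, the token below is gibberish.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # the shared secret, the modern 'key'
cipher = Fernet(key)

token = cipher.encrypt(b"sensitive personal data")
print(token)                         # undecipherable without the key

print(cipher.decrypt(token))         # b'sensitive personal data'
```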

There is normal (or link) encryption, in which data from the sender can be deciphered by a middle party – such as a social media operator – and then encrypted again for the receiver. In this case, the deciphered data remains available with the middle party.

With end-to-end encryption, this middle party plays no such role and the cipher remains undisturbed.
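The difference can be sketched in code. The following hypothetical example reuses the Fernet recipe above: under link encryption the middle party holds keys for both hops and can read the message while relaying it, whereas under end-to-end encryption it only ever sees ciphertext.

```python
from cryptography.fernet import Fernet

# Link encryption: the platform shares one key with the sender and another
# with the receiver, so it can read every message it relays.
sender_hop = Fernet(Fernet.generate_key())     # key shared by sender and platform
receiver_hop = Fernet(Fernet.generate_key())   # key shared by platform and receiver

in_transit = sender_hop.encrypt(b"private note")
readable_at_platform = sender_hop.decrypt(in_transit)   # the platform CAN read this
relayed = receiver_hop.encrypt(readable_at_platform)    # re-encrypted for the receiver
print(receiver_hop.decrypt(relayed))                    # b'private note'

# End-to-end encryption (as in the PyNaCl sketch earlier): the platform never
# holds a usable key and can only store and forward the ciphertext unchanged.
```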

As with every coin, there is a flip side: end-to-end encryption creates a “safe space” for criminals. Hence investigating agencies such as Interpol, which look into money transfer trails, will face a wall.

The encryption part

However, all this may hit yet another legal wall soon. Encryption, in India, is not illegal so far. There is no bar on anybody encrypting any message or data for storage or transfer via any means.

But if a study published in a Carnegie Endowment journal of May 2019 is to be believed, some legislation regulating encryption “based on its perceived hindrance of lawful data collection” is imminent.

The article studies the status of various privacy-related legislation in India and finds that the atmosphere within the government is right for such a law. It says: “The exact nature of the regulation remains undecided because of a need to balance law enforcement needs, apprehensions about the proliferation of unsecured devices, concerns about the security of digital payments and freedom of expression. Whatever the outcome of this debate, it will significantly affect India’s newly recognized fundamental right to privacy, burgeoning economic activity in cyberspace, and security architecture as a whole.”

The author looks into government activities in cyberspace to arrive at these conclusions. He writes: “The Indian government’s present adversarial posture toward regulating online content primarily stems from a lack of capacity to address cyber and cyber-enabled offenses. This is compounded by an inability, under the Mutual Legal Assistance Treaty (MLAT), to systemically gain access to electronic evidence stored abroad. For example, encryption has often been at the core of the confrontation between Indian law enforcement and U.S. technology companies. Indian laws, especially the Information Technology Act 2000, bestow wide powers on law enforcement agencies to intercept and decrypt communications, but these powers are rarely exercised to gather electronic evidence. Instead, agencies rely on legacy search-and-seizure provisions like Section 91 of the Code of Criminal Procedure 1973, when seeking access to electronic communications.”

The author brings in the BlackBerry incident, in which the Indian government tried to force the company to decrypt its internal, encrypted free messaging service. That, probably, showed the way for what is to come.

The outcome

The Orissa High Court’s judgment, on the face of it, should hugely benefit any free-thinking member of the public. However, achieving balanced legislation on this would be a Herculean task.

The problems, as discussed above, would be the objections that can, and will, be raised by investigation agencies, especially the NIA and the NCB in India, claiming that this would cripple efforts to track terror funding and drug cartel money movements. This is a legitimate concern.

There will also be problems in the search for missing children. Such databases – especially facial recognition data – need to be shared among different police forces. But a privacy law could restrict how the police handle the photos and identities of minor girls. How that is to be managed could require a chapter of its own.

The issues involved in writing new legislation in this regard are thus humongous. While the intent of the Orissa High Court is commendable, any attempt at implementation may open a Pandora’s box of troubles.
