E.U. Security Agency Warns on ‘Right to be Forgotten’

Editor’s Note: European and other national/multinational regulation will increasingly impact US-based companies.  International regulatory coordination is necessary to protect the free flow of ideas and commerce.  See FISMA Focus here for more information.

From: WSJ/Tech Europe

By Nick Clayton

Ever since the European Commission published its proposals on the handling of data in January, there have been concerns in particular about the so-called “right to be forgotten”, the idea that individuals should be able to have personal data removed from public access. Many of the loudest voices have come from outside the E.U., including that of Prof. Jeffrey Rosen, who called the proposals the “biggest threat to free speech on the Internet”.

Now even the E.U.’s own European Network and Information Security Agency (ENISA), in a newly published report, is raising serious concerns about the privacy proposals at the most fundamental level. For example, the proposals define “personal data” as data that can be used to identify a natural person, according to InfoSecurity.

[ENISA says] it doesn’t specify whether that identification can be with a high degree of probability but not absolute certainty (such as “a picture of a person or an account of a person’s history”) or even whether it includes the identification of a person not uniquely, “but as a member of a more or less small set of individuals, such as a family.” Incidentally, this issue will only get more complex with the growth of big data analytics, where an individual may never be overtly specified, but may still become recognizable through the accumulation and association of different items.

With photographs, ENISA asks, what if more than one person is pictured? Who decides if it is to be removed? Does this also mean politicians or governments would be able to remove embarrassing news reports?

ENISA then moves to the technical difficulties involved in the right to be forgotten. It notes that in an open global system such as the web, anybody can copy any data and store it anywhere. In short, “enforcing the right to be forgotten is impossible in an open, global system, in general… [since] unauthorized copying of information by human observers is ultimately impossible to prevent by technical means.” It can only be achieved within ‘closed systems’ such as access-controlled public networks entirely within the jurisdiction of the EU.

It also looks at technical solutions such as encryption, which would render data unreadable on a specific date or at an agreed time. Apart from the difficulty of managing the keys and the risk that they could be compromised and the data revealed, encryption does not prevent, for instance, photographs being copied from outside the closed system.
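The report does not prescribe a particular scheme, but one common way to realize this kind of expiry is sometimes called “crypto-shredding”: store only the encrypted record and later destroy the key, making the stored copy unrecoverable. The sketch below is purely illustrative (it uses a toy hash-based stream cipher, not a vetted algorithm, and all names are invented); it also shows why ENISA’s objection holds — deleting the key makes the stored ciphertext unreadable, but does nothing about plaintext copies made beforehand.

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    # Derive a pseudo-random keystream by hashing key || counter.
    # Toy construction for illustration only -- not a vetted cipher.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, data: bytes) -> bytes:
    # XOR the data with the keystream.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

decrypt = encrypt  # XOR stream cipher: the same operation both ways

key = secrets.token_bytes(32)
record = b"personal data subject to expiry"
stored = encrypt(key, record)          # only the ciphertext is retained

# While the key exists, the record can be read back.
assert decrypt(key, stored) == record

# "Forgetting" the record: destroy the key. The stored ciphertext
# is now unrecoverable -- but any plaintext copy made earlier,
# inside or outside the closed system, is untouched.
key = None
```

In practice a real deployment would use an established cipher (e.g. AES) and a key-management service; the point here is only the mechanism, and that key destruction cannot reach copies that have already left the system.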

Perhaps, suggests the agency, instead of removing the data it would be sufficient simply to hide it: security through obscurity. In other words, the right to be forgotten would end up as the right not to be on Google. Just how search engines would manage such a process isn’t made clear.

InfoSecurity: Problems with the EU’s proposed ‘right to be forgotten’
