Behind Closed Doors: A Reflection on Recent Data Protection and Privacy Legislation

By Walter Amoko | Gibran Darr

Some of the grim forebodings of sceptics about the techno-age ushered in through interconnectedness are certainly coming to pass. Thanks to the occasional whistle-blower, we learn of governments in league with tech giants engaged in massive, secret and unlawful data harvesting. Artificial intelligence and automated decision making by financial institutions remain a clear privacy concern. That an algorithm needs access to phone logs, messages and the photo gallery in order to calculate a credit score still baffles many. Some companies deliberately analyse their customers’ on-line activity and commercialise it through targeted advertising, while others obtain personal on-line data that is subjected to behavioural and psychosocial analysis and exploited for political gain.

However, as Shoshana Zuboff demonstrates in her recent book The Age of Surveillance Capitalism, the invidious abuse by private tech giants is all too pervasive and has become the norm rather than the exception. Governments are seemingly in on it too. In the United Kingdom, it was revealed that the National Health Service shared its medical records with Amazon (excluding private patient data, the British were assured, though the disclosed contract was redacted). Little wonder that, in order to avoid digital dystopia, two of the ten principles in Tim Berners-Lee’s Contract for the Web concern respect for and protection of privacy and on-line data.

Prompted by decisions of the European Court of Justice, the European Union (EU) took the lead with the General Data Protection Regulation (GDPR), which might be regarded as a belated effort to ensure such protection, though it is hardly the gold standard many suppose it to be. Kenya has finally jumped on the bandwagon. On 8th November 2019 the President signed into law the Data Protection Act, 2019 (the Act), which took effect on 25th November 2019 and imposes obligations on natural and legal persons (including both Kenyan and non-Kenyan companies that collect, process and store personal data in Kenya) as well as on public authorities and agencies to protect the personal data of Kenyan citizens.

In this article, we do not undertake an elaborate examination of the Act’s provisions – we merely highlight the Act’s potential impact on other areas of the law. At its heart, not surprisingly, is knowing, affirmative consent to the collection, storage and sharing of personal data.

The Act establishes the office of the Data Commissioner and places registration requirements and obligations on “data controllers” – a natural or legal person who determines the purpose and means of processing personal data – and “data processors” – a natural or legal person who processes personal data on behalf of the data controller. It is possible for an organisation to be both a data controller and a data processor.

The Act is modelled largely on the EU’s GDPR, whose ramifications are still not fully worked out, and it supplies an additional arsenal for claims of breaches of reputational and privacy rights. For example, in the Duchess of Sussex’s recent claims against the Daily Mail – claims which, not long ago, would have sounded only in defamation – the GDPR has been invoked. (To be sure, the Duchess of Sussex also relies on other forms of action developed in the relatively recent past, such as misuse of private information, as well as more old-fashioned ones such as breach of copyright.)

Thus, where a matter might once have fallen exclusively within the law of defamation, it would appear that the Act, like the GDPR, portends a sea change in areas previously the subject of other laws.

The Act was passed by parliament with the intention of giving effect to Article 31 (c) and (d) of the Constitution of Kenya 2010, which protects the right to privacy of an individual. Section 3 goes on to set out the objects and purpose of the Act as being to regulate the processing of personal data; to ensure that the processing of personal data is guided by principles set out in the Act; to protect the privacy of individuals; to establish the legal and institutional mechanism to protect personal data; and to provide data subjects with rights and remedies to protect their personal data from processing that is not in accordance with the Act.

The objects of the Act are elaborated in the data protection principles set out under section 25, which require that personal data be:

  • processed in accordance with the right to privacy of the data subject
  • processed lawfully, fairly and in a transparent manner in relation to any data subject
  • collected for explicit, specified and legitimate purposes and not further processed in a manner incompatible with those purposes
  • adequate, relevant and limited to what is necessary in relation to the purposes for which it is processed
  • collected only where a valid explanation is provided whenever information relating to family or private affairs is required
  • accurate and, where necessary, kept up to date, with every reasonable step being taken to ensure that any inaccurate personal data is erased or rectified without delay
  • kept in a form which identifies the data subjects for no longer than is necessary for the purposes for which it was collected
  • not transferred outside Kenya, unless there is proof of adequate data protection safeguards or consent from the data subject

The starting point for an examination of the Act in a media context is the definition of “personal data” in the interpretation section: any information relating to an “identified or identifiable natural person”. The principles set out under section 25 of the Act provide for purpose limitation and data minimisation, and further espouse the principle of accuracy. These principles, which mirror the GDPR principles, were considered by the Court of Justice of the European Union (CJEU) in Google Spain SL, Google Inc. v Agencia Española de Protección de Datos, Mario Costeja González (2014), which resulted in a victory for privacy campaigners on the “right to be forgotten” in respect of information that had become “outdated and inaccurate”.

A similar application of these principles under the Act is likely to affect the obligations of journalists as to the accuracy and relevance of content long after publication, notwithstanding that the content may have been true and accurate at the time it was published. Section 52 (1) (c) of the Act, like Article 85 of the GDPR, does provide an artistic, literary and journalistic exemption from the data protection principles where the data controller “reasonably believes that, in all the circumstances, compliance with the provision is incompatible with the special purposes”. It does not help that the term “special purposes” is not defined in the Act, but the wording of the section suggests that the exemption cannot be invoked simply because content is journalistic, artistic or literary in nature; it will require an objective test as to the reasonable belief of the data controller prior to publication.

Section 52 (3) of the Act provides that the Data Commissioner will prepare a Code containing practical guidance in relation to the processing of personal data for the purposes of literature, art and journalism. In view of the test laid down by section 52 (1) (c), it would be prudent for that guidance to be framed against the background of the objective considerations which the provision sets up.

A good example of the interplay between statutory data protection provisions and ordinary principles of the common law is the decision of the English Court of Appeal in WM Morrison Supermarkets v Various Claimants [2018] EWCA Civ 2399, where the Court upheld a finding that, under data protection legislation – in this case the United Kingdom’s Data Protection Act, 1998 (UK DPA) – an employer can be liable for data breaches by its employees. The case involved a disgruntled employee of Morrison’s supermarket who copied and published personal data relating to thousands of its employees. The affected employees sued Morrison’s, claiming damages for its own breaches of the UK DPA as well as vicarious liability in respect of their personal data, which the disgruntled employee had misused in breach of the UK DPA. The claim against Morrison’s for its own breaches failed, but the High Court held that Morrison’s was vicariously liable for the statutory breach committed by the disgruntled employee. Both findings turned on whether, under the UK DPA, Morrison’s or the employee was a data controller. Though not itself a data controller, Morrison’s was, at common law, vicariously liable for the acts of the rogue employee – the data controller. The appeal on the question of whether Morrison’s was vicariously liable was dismissed. The Court of Appeal rejected Morrison’s argument that the UK DPA is an exclusive and comprehensive code, such that a remedy not provided for in it is unavailable. Having analysed the arguments on the statute’s silence as to employers’ liability for breaches by employee data controllers, and the concession that the UK DPA did not exclude common law remedies, the Court of Appeal concluded:

“…the concession that the causes of action for misuse of private information and breach of confidentiality are not excluded by the DPA in respect of the wrongful processing of data within the ambit of the DPA, and the complete absence of any provision of the DPA addressing the situation of an employer where an employee data controller breaches the requirements of the DPA, lead inevitably to the conclusion that the Judge was correct to hold that the common law remedy of vicarious liability of the employer in such circumstances (if the common law requirements are otherwise satisfied) was not expressly or impliedly excluded by the DPA.”

The Court of Appeal also noted that, while the UK DPA contained no provision addressing the position of an employer whose employee data controller breaches the UK DPA’s requirements, neither did it exclude such vicarious liability, applying the principle that if the legislation had intended to exclude the application of ordinary common law principles it would have done so expressly. We cannot in this article do full justice to the reasoning of the judgment, which was essentially an attempt to apply well-established common law principles to legislation seeking to protect privacy in what has, perhaps not too accurately, been christened the information age. The Supreme Court of the United Kingdom heard Morrison’s appeal from the Court of Appeal in November 2019 and will in due course give its latest thinking on the interplay between legislation and the common law in this area.

While the subject matter is new, the underlying debate is a perennial one across the common law world and is set to recur: when legislation is silent, is it ever appropriate for courts to supplement its provisions with principles of the common law and thus, while enhancing the remedies available, impose liability where Parliament chose not to?
