Wednesday, 24 October 2018

The Morrisons Appeal - Vicarious Liability for Employees' Breaches of Confidence and Statutory Duty

Royal Courts of Justice
Author Rafa Esteve
Licence Creative Commons Attribution Share Alike 4.0 International
Source Wikipedia

Jane Lambert

Court of Appeal (Sir Terence Etherton MR and Lords Justices Bean and Flaux) Various Claimants v W M Morrison Supermarkets Plc  [2018] EWCA Civ 2339 (22 Oct 2018)

In  Various Claimants v WM Morrisons Supermarkets Plc (Rev 1) [2017] EWHC 3113 (QB), [2018] 3 WLR 691, Mr Justice Langstaff held that W M Morrisons Supermarket Plc ("Morrisons") was vicariously liable to its employees for the unauthorized act of one Skelton, an internal auditor, who had posted the names, addresses, gender, dates of birth, phone numbers (home or mobile), national insurance numbers, bank sort codes, bank account numbers and salaries of Morrisons' employees to a file sharing website. Skelton had acted as he did out of spite.  He had a grudge against Morrisons and wanted to injure the company.  The judge acknowledged at para [198] of his judgment that the effect of his judgment was to accomplish that injury and for that reason he gave the supermarket chain permission to appeal.  I commented on the case in Morrisons - Primary and Vicarious Liability for Breaches of Data Protection Act 1998 11 Dec 2017.

The defendant appealed on the following grounds:
"First, the Judge ought to have concluded that, on its proper interpretation and having regard to the nature and purposes of the statutory scheme, [the Data Protection Act 1998 ("the DPA")] excludes the application of vicarious liability. Second, the Judge ought to have concluded that, on its proper interpretation, the DPA excludes the application of causes of action for misuse of private information and breach of confidence and/or the imposition of vicarious liability for breaches of the same. Third, the Judge was wrong to conclude (a) that the wrongful acts of Mr Skelton occurred during the course of his employment by Morrisons, and, accordingly, (b) that Morrisons was vicariously liable for those wrongful acts."
By their respondents' notice, the claimants sought to uphold the judge's order on the additional ground "that, in evaluating whether there was a sufficient connection between Mr Skelton's employment and his wrongful conduct to make it right for Morrisons to be held vicariously liable, the Judge ought to have taken into account that Mr Skelton's job included the task or duty delegated to him by Morrisons of preserving confidentiality in the claimants' payroll information." The appeal came on before the Master of the Rolls and Lord Justices Bean and Flaux, who heard argument on 9 and 10 Oct and delivered judgment on 22 Oct 2018.

Their lordships dismissed Morrisons' appeal.

As for the first and second grounds, the Court concluded at para [48] that it was clear that the vicarious liability of an employer for misuse of private information by an employee and for breach of confidence by an employee had not been excluded by the Data Protection Act 1998.  The applicable principle for determining that issue was whether, on the true construction of the statute in question,  Parliament had intended to exclude vicarious liability.  The appropriate test was:
"If the statutory code covers precisely the same ground as vicarious liability at common law, and the two are inconsistent with each other in one or more substantial respects, then the common law remedy will almost certainly have been excluded by necessary implication. As Lord Dyson said in the Child Poverty Action Group case (at [34]) the question is whether, looked at as a whole, the common law remedy would be incompatible with the statutory scheme and therefore could not have been intended to coexist with it."
Their lordships reasoned that if Parliament had intended to exclude that cause of action, it would have said so expressly. Secondly, Morrisons' counsel had conceded in her submissions that the Act had not excluded the action for breach of confidence or misuse of personal information.   Their lordships observed at [56]:
"Morrisons' acceptance that the causes of action at common law and in equity operate in parallel with the DPA in respect of the primary liability of the wrongdoer for the wrongful processing of personal data while at the same time contending that vicarious liability for the same causes of action has been excluded by the DPA is, on the face of it, a difficult line to tread."
They added at [57]:
"......  the difficulty of treading that line becomes insuperable on the facts of the present case because, as was emphasised by Mr Barnes [the claimants' counsel], the DPA says nothing at all about the liability of an employer, who is not a data controller, for breaches of the DPA by an employee who is a data controller."
The concession that the causes of action for misuse of private information and breach of confidence are not excluded by the Act in respect of the wrongful processing of data within the ambit of the statute, and the complete absence of any provision addressing the situation of an employer where an employee data controller breaches the requirements of the Act, led inevitably to the conclusion that Mr Justice Langstaff was correct to hold that the common law remedy of vicarious liability of the employer was not expressly or impliedly excluded by the Act.

In respect of the third ground of appeal, the Court referred to the judgment of Lord Toulson in Mohamud v WM Morrison Supermarkets Plc [2016] UKSC 11, [2016] IRLR 362, [2016] ICR 485, [2016] 2 WLR 821, [2017] 1 All ER 15, [2016] AC 677, [2016] PIQR P11, [2016] WLR(D) 109.  At para [44] Lord Toulson had asked "what functions or "field of activities" have been entrusted by the employer to the employee, or, in everyday language, what was the nature of his job?"  Next, "the court must decide whether there was sufficient connection between the position in which he was employed and his wrongful conduct to make it right for the employer to be held liable under the principle of social justice which goes back to Holt CJ."  As to Lord Toulson's first question, the Court of Appeal endorsed the trial judge's finding that Morrisons had entrusted Skelton with the payroll data and that it was part of his job to disclose it to a third party.  He had clearly exceeded his authority, but that did not matter because his wrongdoing was nonetheless closely related to the task that he had to do.  As to the second part of Lord Toulson's test, the Court endorsed Mr Justice Langstaff's finding that there was an unbroken thread that linked his work to the disclosure.

As noted above, the trial judge had been troubled by the thought that the court was facilitating Skelton's wrongdoing.  The Court of Appeal noted at para [75] that it had not been shown any  reported case in which the motive of the employee committing the wrongdoing was to harm his employer rather than to achieve some benefit for himself or to inflict injury on a third party.  Morrisons submitted that it would be wrong to impose vicarious liability on an employer in circumstances such as this especially as there were so many potential claimants.   Their lordships had no trouble in rejecting those submissions.  Motive was irrelevant and to have held otherwise would have left thousands of hapless data subjects without remedy.

In Mohamud, Lord Toulson had remarked at para [40] of his judgment that:
"The risk of an employee misusing his position is one of life's unavoidable facts."
The solution for employers was to insure against liability for the misdeeds of their staff.   As the Master of the Rolls put it at [78]:
"There have been many instances reported in the media in recent years of data breaches on a massive scale caused by either corporate system failures or negligence by individuals acting in the course of their employment. These might, depending on the facts, lead to a large number of claims against the relevant company for potentially ruinous amounts. The solution is to insure against such catastrophes; and employers can likewise insure against losses caused by dishonest or malicious employees. We have not been told what the insurance position is in the present case, and of course it cannot affect the result. The fact of a defendant being insured is not a reason for imposing liability, but the availability of insurance is a valid answer to the Doomsday or Armageddon arguments put forward by Ms Proops on behalf of Morrisons."
 That last paragraph will be one of the reasons why this case will appear in countless skeleton arguments and law reports in the future.  The other is the Court's analysis of the circumstances when a statutory code displaces common law remedies.

Anyone wishing to discuss this case or data protection generally should call me on 020 7404 5252 or send me a message through my contact form.

Sunday, 14 October 2018

Privacy Sandbox

Author Hyena
Reproduced with kind permission of the author
Source Wikipedia

Jane Lambert

The Information Commissioner's Office has just carried out a consultation on creating a regulatory sandbox "to develop innovative products and services using personal data in innovative ways" (see ICO call for views on creating a regulatory sandbox on the ICO website).  The idea of a sandbox was pioneered by the Financial Conduct Authority which described it as  "a ‘safe space’ in which businesses can test innovative products, services, business models and delivery mechanisms without immediately incurring all the normal regulatory consequences of engaging in the activity in question" (see FCA Regulatory Sandbox Nov 2015). 

The FCA's idea of a sandbox for new products, services and business models proved not only feasible but popular and has been imitated by other financial services regulators around the world.  The Information Commissioner announced her intention of extending the idea to data protection in her Information Rights Strategic Plan 2017 - 2021:
"Technology goal #8: To engage with organisations in a safe and controlled environment to understand and explore innovative technology. 
  • We will establish a ‘regulatory sandbox’, drawing on the successful sandbox process that the Financial Conduct Authority has developed. The ICO sandbox will enable organisations to develop innovative digital products and services, whilst engaging with the regulator, ensuring that appropriate protections and safeguards are in place. As part of the sandbox process the ICO would provide advice on mitigating risks and data protection by design. 
  • In 2018 we will consult and engage with organisations about implementation of a sandbox."
The consultation closed on Friday but the Call for Evidence makes clear that that was only the first stage of the consultation process. There will be a more detailed proposal for consultation later in the year.

In his blog post Your views will help us build our regulatory sandbox, Chris Taylor, Head of Assurance at the ICO, set out the topics upon which he wants to hear from the public:
  • "what you think the scope of any such sandbox should be - should we focus on particular innovations, sectors or types of organisations?
  • what you think the benefits might be to working in a sandbox, whether that’s our expert input or increased reassurance for your customers or clients.
  • what mechanisms you might find most helpful in a sandbox – from adaptations to our approach, to informal steers or the provision of technical guidance – what are the tools that a sandbox might contain?
  • at what stage in the design and development process a sandbox would be most useful to you?"
Mr Taylor also made a point that applies to innovation generally and not just to data protection law.  It is often said in the USA and in some quarters in this country that red tape (a derogatory term for regulation) hobbles innovation and enterprise.  If that were so, the USA would be the most innovative nation on earth, but a glance at the Global Innovation Index shows that it lies behind four European nations including the UK.

He explains that
"privacy and innovation go hand in hand. It’s not privacy or innovation, it’s privacy and innovation – because organisations that use our sandbox won’t be exempt from data protection law."
A regulatory sandbox enables regulators to anticipate and make provision for difficulties before they arise, thus sparing new technologies and businesses from the legal quagmires that dogged earlier technologies, when the law reacted to innovation after the event and often imperfectly.

The proposed sandbox should mitigate the privacy uncertainties affecting new products, services and business models, but it won't remove them all.  Other issues will remain, such as patenting and other IP protection, licensing, competition and so forth.  I am well placed, and should be glad, to help fintech and other entrepreneurs or the patent and trade mark attorneys, solicitors, accountants and other professional advisers who may assist them.  Should any of them wish to discuss this article or data protection generally, they are welcome to call me on +44 (0)20 7404 5252 during office hours or send me a message through my contact form.

Thursday, 11 October 2018

Data Protection after Brexit

Author Furfur
Licence Creative Commons Attribution Share Alike 4.0 International

Jane Lambert

Because of the importance of its financial services industry, preserving an uninterrupted flow of personal data across national frontiers is particularly important to the UK.  At present, such flow is guaranteed by art 1 (3) of the General Data Protection Regulation (Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC) ("the GDPR").  When the UK leaves the EU, the GDPR will cease to apply to this country and the UK will become a third country for the purposes of Chapter V of the GDPR.

Our exit from the EU will have no effect on the obligations of data controllers and processors in the UK or the rights of data subjects anywhere with regard to UK controllers and processors because s.3 (1) of the European Union (Withdrawal) Act 2018 will incorporate the GDPR into our law.  The Department for Digital, Culture, Media and Sport has confirmed in its guidance note Data Protection if there's no Brexit deal of 13 Sept 2018 that
"[i]n recognition of the unprecedented degree of alignment between the UK and EU’s data protection regimes, the UK would at the point of exit continue to allow the free flow of personal data from the UK to the EU"
though it adds that  the UK would keep this under review.  On the other hand, art 44 of the GDPR makes clear that data controllers and processors in the states that remain in the EU would be able to transmit personal data to the UK only in accordance with the provisions of Chapter V of the regulation.

Art 45 (1) of the GDPR provides:
"A transfer of personal data to a third country or an international organisation may take place where the Commission has decided that the third country, a territory or one or more specified sectors within that third country, or the international organisation in question ensures an adequate level of protection."
The Department for Digital, Culture, Media and Sport's guidance note states that the "European Commission has stated that if it deems the UK’s level of personal data protection essentially equivalent to that of the EU, it would make an adequacy decision allowing the transfer of personal data to the UK without restrictions." However, it adds that while HM government wants to begin preliminary discussions on an adequacy assessment now, the Commission has stated that a decision on adequacy cannot be taken until the UK is a third country.

Unless and until the Commission makes an adequacy assessment businesses in the UK must rely on one of the other provisions of Chapter V of the GDPR.  The guidance note suggests:
"For the majority of organisations the most relevant alternative legal basis would be standard contractual clauses. These are model data protection clauses that have been approved by the European Commission and enable the free flow of personal data when embedded in a contract. The clauses contain contractual obligations on you and your EU partner, and rights for the individuals whose personal data is transferred. In certain circumstances, your EU partners may alternatively be able to rely on a derogation to transfer personal data."
It recommends businesses proactively to consider what action they may need to take to ensure the continued free flow of data with EU partners.
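
Purely by way of illustration, the guidance note's advice can be reduced to a short decision sequence, sketched below in Python. The function name and messages are invented for the purpose and cover only the options mentioned in the note (an adequacy decision, standard contractual clauses and the derogations), not every transfer mechanism in Chapter V.

# Illustrative only: a rough decision sequence for EU-to-UK transfers after
# exit, following the guidance note summarised above. The labels are invented.
def transfer_basis(adequacy_decision, scc_in_place, derogation_applies):
    if adequacy_decision:
        return "Transfer may proceed under an art 45 adequacy decision"
    if scc_in_place:
        return "Rely on Commission-approved standard contractual clauses"
    if derogation_applies:
        return "Rely on a Chapter V derogation in the specific circumstances"
    return "No lawful basis identified - the transfer should not proceed"

# Until the Commission makes an adequacy decision, most organisations
# would be looking at the second branch.
print(transfer_basis(False, True, False))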

If the British government and the EU reach a withdrawal agreement in time for ratification before 29 March 2019, there will be an implementation period during which the GDPR will continue to apply to the UK until 31 Dec 2020.  What happens after that will depend on the terms of the agreement on the future relationship between the EU and the UK.  At para 3.2.1 (8) of its white paper The Future Relationship between the United Kingdom and the European Union (Cm 9503) the government says:
"The UK believes that the EU’s adequacy framework provides the right starting point for the arrangements the UK and the EU should agree on data protection but wants to go beyond the framework in two key respects:
a. on stability and transparency, it would benefit the UK and the EU, as well as businesses and individuals, to have a clear, transparent framework to facilitate dialogue, minimise the risk of disruption to data flows and support a stable relationship between the UK and the EU to protect the personal data of UK and EU citizens across Europe; and
b. on regulatory cooperation, it would be in the UK’s and the EU's mutual interest to have close cooperation and joined up enforcement action between the UK's Information Commissioner's Office (ICO) and EU Data Protection Authorities."
It is still not clear whether the EU will agree to the white paper proposal or even whether there will be a withdrawal agreement that allows for a transitional period.

Anyone wishing to discuss this article or data protection generally should call me on 020 7404 5252 during office hours or send me a message through my contact form.

Monday, 25 June 2018

What is meant by the "Applied GDPR"

Jane Lambert

The term the "applied GDPR" is defined by s.3 (11) of the Data Protection Act 2018 as  the GDPR as applied by Chapter 3 of Part 2 of the Act.  According to s.4 (3) Chapter 3 applies to certain types of processing of personal data to which the GDPR does not apply and makes provision for a regime broadly equivalent to the GDPR to apply to such processing.   S.22 (1) of the Act provides that the  GDPR applies to the processing of personal data to which Chapter 3 applies as if its articles were part of an Act of Parliament.

Processing to which Chapter 3 applies
S.21 provides that Chapter 3 applies to:

  • automated or structured processing of personal data in the course of an activity that:
    • falls outside the scope of EU law; or
    • is carried out by a member state in relation to the EU's common foreign and security policy but does not fall within law enforcement as that is covered by Part 3 or processing by intelligence services which is covered by Part 4 (s.21 (1)); and
  • manual unstructured processing of personal data held by certain public authorities (s.21 (2)).
S.22 (1) extends the GDPR to the processing of personal data to which Chapter 3 applies as if the GDPR's articles were part of an Act of Parliament for the whole UK.   The explanatory note explains that Chapter 3 applies to manual unstructured processing of personal data held by certain public authorities because such processing was regulated by the Data Protection Act 1998 but not by the GDPR. The public authorities concerned are defined by s.21 (5) as public authorities as defined by the Freedom of Information Act 2000 or Scottish public authorities as defined by the Freedom of Information (Scotland) Act 2002.

Modifications to the GDPR
The provisions of the GDPR that apply to the processing to which Chapter 3 applies are modified by Part 1 of Sched. 6 to the Act. That Part consists of 72 paragraphs, most of which modify articles of the GDPR. For instance, art 2 of the GDPR provides:

"Material scope
1. This Regulation applies to the processing of personal data wholly or partly by automated means and to the processing other than by automated means of personal data which form part of a filing system or are intended to form part of a filing system.
2. This Regulation does not apply to the processing of personal data:
(a) in the course of an activity which falls outside the scope of Union law;
(b) by the Member States when carrying out activities which fall within the scope of Chapter 2 of Title V of the TEU;
(c) by a natural person in the course of a purely personal or household activity;
(d) by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, including the safeguarding against and the prevention of threats to public security.
3. For the processing of personal data by the Union institutions, bodies, offices and agencies, Regulation (EC) No 45/2001 applies. Regulation (EC) No 45/2001 and other Union legal acts applicable to such processing of personal data shall be adapted to the principles and rules of this Regulation in accordance with Article 98.
4. This Regulation shall be without prejudice to the application of Directive 2000/31/EC, in particular of the liability rules of intermediary service providers in Articles 12 to 15 of that Directive."
Para 7 substitutes the following provision for art 2 of the GDPR in relation to the processing to which Chapter 3 applies:
“2  This Regulation applies to the processing of personal data to which Chapter 3 of Part 2 of the 2018 Act applies (see section 21 of that Act).”
Supplementary Provisions
As I noted in The Relationship between the Data Protection Act 2018 and the GDPR 20 June 2018, s.4 (2) of the Act provides for Chapter 2 of Part 2 to apply to the types of processing of personal data to which the GDPR applies by virtue of art 2 of the GDPR.  I discussed the provisions of Chapter 2 in that article.  Chapter 2 also applies to the applied GDPR as it applies to the GDPR by virtue of s.22 (2), but Part 2 of Sched. 6 modifies Chapter 2 of Part 2 in respect of the applied GDPR pursuant to s.22 (4) (b).

Interpretation of the Applied GDPR
S.22 (5) of the Act provides:
"A question as to the meaning or effect of a provision of the applied GDPR, or the applied Chapter 2 , is to be determined consistently with the interpretation of the equivalent provision of the GDPR, or Chapter 2 of this Part, as it applies otherwise than by virtue of this Chapter, except so far as Schedule 6 requires a different interpretation."
Rule Making Powers
S.23 (1) enables the Secretary of State to make regulations in relation to the processing of personal data to which Chapter 3 applies.

Manual Unstructured Data
S.24 makes certain modifications to the applied GDPR in relation to unstructured data held by public authorities as defined by the Freedom of Information Act 2000 or Scottish public authorities as defined by the Freedom of Information (Scotland) Act 2002.

Exemptions
Exemptions are made for manual unstructured data used in longstanding historical research by virtue of s.25, and for national security and defence pursuant to s.26, s.27 and s.28.

Further Information
Anyone wishing to discuss this article or data protection generally should call me during office hours on +44 (0)20 7404 5252 or send me a message through my contact form.

Wednesday, 20 June 2018

The Relationship between the Data Protection Act 2018 and the GDPR

Jane Lambert

As I mentioned on the index page for the Data Protection Act 2018, s.1 (1) of the Act states that the Act makes provision about the processing of personal data.  As everyone knows, most processing of personal data is subject to the GDPR, but the GDPR makes many references to national law.  Even though the GDPR is directly applicable in the laws of each of the member states by virtue of art 288 of the Treaty on the Functioning of the European Union, it needs to be supplemented by national legislation to function effectively.  That is why s.1 (3) provides that Part 2 of the Act supplements the GDPR.

The Legislative Scheme
S.1 (1) and (2) are amplified by s.2 (1) which provides:
"The GDPR, the applied GDPR and this Act protect individuals with regard to the processing of personal data, in particular by—
(a) requiring personal data to be processed lawfully and fairly, on the basis of the data subject’s consent or another specified basis,
(b) conferring rights on the data subject to obtain information about the processing of personal data and to require inaccurate personal data to be rectified, and
(c) conferring functions on the Commissioner, giving the holder of that office responsibility for monitoring and enforcing their provisions."
S.4 (2) adds that Chapter 2 of Part 2 applies to the types of processing of personal data to which the GDPR applies by virtue of art 2 and that that Chapter supplements, and must be read with, the GDPR.

Understanding the Scheme
Probably the best way to understand the scheme is to take an example. 

Art 5 of the GDPR  sets out a number of principles for the processing of personal data.  The first of those principles is that personal data shall be processed lawfully, fairly and in a transparent manner in relation to the data subject.  Art 6 (1) stipulates that processing shall be lawful only if and to the extent that one or more specified circumstances apply. One of those circumstances is that processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller (point "e").

What constitutes the public interest and official authority are matters for the legislatures of the member states.   S.8 of the Data Protection Act 2018 provides:
"In Article 6 (1) of the GDPR (lawfulness of processing), the reference in point (e) to processing of personal data that is necessary for the performance of a task carried out in the public interest or in the exercise of the controller’s official authority includes processing of personal data that is necessary for—
(a) the administration of justice,
(b) the exercise of a function of either House of Parliament,
(c) the exercise of a function conferred on a person by an enactment or rule of law,
(d) the exercise of a function of the Crown, a Minister of the Crown or a government department, or
(e) an activity that supports or promotes democratic engagement."
There are similar supplementary provisions on such matters as children's consent, special categories of personal data, powers to make regulations on the fees that can be charged by data controllers in exceptional circumstances, exemptions and transfers abroad.

Further Information
Should anyone wish to discuss this article or data protection generally, he or she should call me on 020 7404 5252 during office hours or send me a message through my contact form.

Monday, 11 June 2018

The Data Protection Act 2018 - repealing the 1998 Act and applying the GDPR

Jane Lambert


As everyone knows, the GDPR (Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation)) repealed and replaced the Data Protection Directive (Directive 95/46/EC) with effect from 25 May 2018.

Even though it repealed the Directive, which had been implemented into English and Welsh, Scottish and Northern Irish law by the Data Protection Act 1998, the GDPR did not automatically repeal the 1998 Act, although the doctrine of the primacy of EU law recognized by the House of Lords in R (Factortame Ltd) v Secretary of State for Transport (No 2) [1990] UKHL 13, [1991] 1 Lloyd's Rep 10, [1991] 1 AC 603, [1991] 1 All ER 70, [1990] 3 WLR 818, (1991) 3 Admin LR 333, [1990] 3 CMLR 375 would have had that practical effect.

For the avoidance of any doubt, Parliament passed the Data Protection Act 2018, which received royal assent on 23 May 2018, two days before the General Data Protection Regulation ("GDPR") was due to take effect.  The introductory text describes the new Act as:
"An Act to make provision for the regulation of the processing of information relating to individuals; to make provision in connection with the Information Commissioner’s functions under certain regulations relating to information; to make provision for a direct marketing code of practice; and for connected purposes."
It consists of 215 sections and 20 schedules.  It is intended to supplement the GDPR and implement the Data Protection Law Enforcement Directive (Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, and repealing Council Framework Decision 2008/977/JHA).

Because the Act received royal assent only days before the GDPR was due to come into effect, the following provisions came into effect immediately:
The very next day, Margot James MP, Minister for State at the Department for Digital, Culture, Media and Sport, signed The Data Protection Act 2018 (Commencement No. 1 and Transitional and Saving Provisions) Regulations 2018 SI 2018 No 625. Reg 2 (1) of those Regulations brought the following provisions of the Data Protection Act 2018 into effect from 25 May 2018:
It will be seen that most of the Act is already in force and the few provisions that are not will come into force on 23 July 2018.

The provisions that repeal most of the 1998 Act are s.211 (1) (a) and para 44 of Sched. 19 of the Data Protection Act 2018.  S.211 (1) (a) provides:
"In Schedule 19—
(a)  Part 1 contains minor and consequential amendments of primary legislation ..."
Para 44 of Sched. 19 adds:
"The Data Protection Act 1998 is repealed, with the exception of section 62 and paragraphs 13, 15, 16, 18 and 19 of Schedule 15 (which amend other enactments)."
There are, of course, transitional and saving provisions that I shall address when occasion demands.

Anyone wishing to discuss this article, GDPR or data protection generally may call me on +44 (0)20 7404 5252 during office hours or send me a message through my contact form.

Friday, 20 April 2018

Right to be Forgotten - NT1 and another v Google

Royal Courts of Justice
Author Rafa Esteve
Licence Creative Commons Attribution Share Alike 4.0 International
Source Wikipedia

Jane Lambert

Queen's Bench Division (Mr Justice Warby) NT 1 and NT 2 v Google LLC [2018] EWHC 799 (QB) (13 Apr 2018)

These were claims by two businessmen known respectively as NT1 and NT2 against Google LLC.  Both claimants had been convicted of criminal offences but, as their convictions had been spent, they sought orders requiring details of their offending, convictions and sentences to be removed from Google's search results on the grounds that the information was inaccurate, old, out of date, irrelevant, of no public interest, and/or otherwise an illegitimate interference with their human rights. The claimants also sought compensation from Google for continuing to return search results disclosing such details, after those complaints had been brought to its notice. In NT1's case, Google agreed to block one link on 7 Oct 2014 but declined to block any of the others. NT1 asked Google to reconsider his request, but Google stood by its position. On 26 Jan 2015, NT1's solicitors wrote to Google requiring it to cease processing links to 2 media reports. In April 2015, Google refused. NT1 brought these proceedings on 2 Oct 2015 seeking orders for the blocking and/or erasure of links to the 2 media reports, an injunction to prevent Google from continuing to return such links, and financial compensation. In December 2017, NT1 expanded his claim to cover a third link, relating to a book extract covering the same subject-matter, in similar terms.

NT2's Claim
His lordship summarized NT2's claim at para [7]:
"In the early 21st century, when he was in his forties, NT2 was involved in a controversial business that was the subject of public opposition over its environmental practices. Rather more than ten years ago he pleaded guilty to two counts of conspiracy in connection with that business, and received a short custodial sentence. The conviction and sentence were the subject of reports in the national and local media at the time. NT2 served some six weeks in custody before being released on licence. The sentence came to an end over ten years ago. The conviction became "spent" several years ago. The original reports remained online, and links continued to be returned by Google Search. NT2's conviction and sentence have also been mentioned in some more recent publications about other matters, two of them being reports of interviews given by NT2. In due course, NT2 asked Google to remove such links."
NT2's solicitors submitted a delisting request on 14 Apr 2015. It related to 8 links. Google responded promptly by email on 23 Apr 2015, declining to delist and saying that the links in question "relate to matters of substantial public interest to the public regarding [NT2's] professional life". On 24 June 2015, NT2's solicitors sent a letter of claim. On 2 Oct 2015 they issued proceedings, claiming relief in respect of the same 8 links as in the delisting request. In the course of the proceedings, complaints about a further 3 links were added to the claim. The claim advanced by NT2 therefore relates to 11 items.

The Issues
The judge summarized the issues in dispute in each case as follows at para [9]:
"(1) whether the claimant is entitled to have the links in question excluded from Google Search results either
(a) because one or more of them contain personal data relating to him which are inaccurate, or
(b) because for that and/or other reasons the continued listing of those links by Google involves an unjustified interference with the claimant's data protection and/or privacy rights; and 
(2) if so, whether the claimant is also entitled to compensation for continued listing between the time of the delisting request and judgment."
His lordship added:
"Put another way, the first question is whether the record needs correcting; the second question is whether the data protection or privacy rights of these claimants extend to having shameful episodes in their personal history eliminated from Google Search; thirdly, there is the question of whether damages should be paid."
The judge noted at para [10] that these were novel questions that had never been considered by the courts. They arose in a legal environment which was complex and had developed over time.

The Legal Framework
At para [13] of his judgment Mr Justice Warby set out the legal framework:
  1. The European Convention on Human Rights ("the Convention") and in particular art 8 and art 10;
  2. S.3 (1) of the European Communities Act 1972 requiring courts in the UK to make decisions on matters of EU law in accordance with the decisions of the Court of Justice of the European Union ("CJEU");
  3. The Rehabilitation of Offenders Act 1974 which provides for certain convictions to become spent after specified periods of time;
  4. The Data Protection Directive and in particular arts 2, 6, 8, 9, 12, 14, 23 and 29;
  5. The Data Protection Act 1998 and its implementing regulations;
  6. The Human Rights Act 1998 which imported the Convention into English law;
  7. The decisions of the House of Lords in Campbell v MGN Ltd [2004] AC 457, [2004] EMLR 15, [2004] 2 AC 457, [2004] UKHRR 648, [2004] 2 All ER 995, [2004] HRLR 24, [2004] UKHL 22, 16 BHRC 500, [2004] 2 WLR 1232 and Re S (a child) [2004] UKHL 47, [2004] 3 WLR 1129, [2004] 4 All ER 683, [2004] 3 FCR 407, [2005] AC 593, [2005] HRLR 5, 17 BHRC 646, [2005] EMLR 2, [2005] Crim LR 310, [2005] 1 FLR 591, [2005] EMLR 11, [2005] 1 AC 593, [2005] UKHRR 129;
  8. The Charter of Fundamental Rights of the European Union (OJ 18.12.2000 C 364/1);
  9. The decision of the CJEU in C-131/12 Mario Costeja Gonzalez v Google Spain and another EU:C:2014:317, [2014] 3 WLR 659, [2014] EUECJ C-131/12, [2014] All ER (EC) 717, [2014] EMLR 27, [2014] 3 CMLR 50, [2014] ECDR 16, [2014] 2 All ER (Comm) 301, ECLI:EU:C:2014:317, [2014] 1 QB 1022, 36 BHRC 589, [2014] QB 1022; and
  10. The General Data Protection Regulation ("GDPR") and in particular art 17.
NT1's Contentions
NT1 contended that Google was a "data controller" within the meaning of s.1 (1) of the Data Protection Act 1998 and that it owed him a duty under s.4 (4) to process data relating to him in accordance with the "data protection principles" as set out in Sched. 1 to the  Act.  He complained that Google had breached the 1st, 4th, 5th and 6th principles:
"1.  Personal data shall be processed fairly and lawfully and, in particular, shall not be processed unless
(a)   at least one of the conditions in Schedule 2 is met, and
(b)   in the case of sensitive personal data, at least one of the conditions in Schedule 3 is also met.
.......................................
4.   Personal data shall be accurate and, where necessary, kept up to date.
5.   Personal data processed for any purpose or purposes shall not be kept for longer than is necessary for that purpose or those purposes.
6.   Personal data shall be processed in accordance with the rights of data subjects under this Act."
NT1 alleged that Google had breached the 4th data protection principle by linking to 6 articles that contained inaccuracies.  In the alternative he argued that linking to those articles breached one or more of the other principles.

Google's Response to NT1
Google admitted that it was a data controller, that it owed a duty to comply with the data protection principles, that there were inaccuracies in the 6 articles and that it had to balance the interests of the data subject against those of the public in accordance with the CJEU's judgment in the Google Spain case, but it offered the following three-pronged defence.  First, it argued that NT1's claim was an abuse of the process of the court on the ground that NT1 was using the Data Protection Act 1998 to obtain relief for damage to reputation that would be unavailable in defamation proceedings by reason of s.8 of the Rehabilitation of Offenders Act 1974.  Secondly, it contended that it had properly carried out the balancing exercise prescribed by the CJEU in Google Spain.  Thirdly, it relied on s.32 of the 1998 Act which provides a limited exemption for journalistic purposes.

The Judgment on NT1's Claim
Though he rejected Google's abuse of process argument and its journalistic purpose defence, Mr Justice Warby found for Google.

As to the abuse of process argument, his lordship agreed that NT1 had brought this action in order to protect his reputation but he was also relying on the CJEU's decision in Google Spain. He accepted NT1's submission that "the Court should not be too liberal in its labelling of prejudice as 'injury to reputation', lest it undermine the Google Spain regime."

He rejected the s.32 defence on the ground that Google's processing of personal data was not only for journalistic, literary or artistic purposes as required by s.32 (1).  Google was a search engine and as such it processed personal data for all sorts of purposes.  Secondly, s.32 (1) (b) and (c) required "reasonable belief" on Google's part that publication would be in the public interest and that complying with the data protection principles would be incompatible with the journalistic, literary or artistic purposes. The judge could find no evidence that Google had given any thought to the public interest.

At paragraph [93] of his judgment, the judge considered whether the 4th data protection principle had been breached.  He found a few inaccuracies in the articles to which NT1 objected, but they were not so serious as to give a false impression of the crimes of which he had been convicted.  Some of NT1's complaints had been insufficiently pleaded.  Other complaints were not supported by the evidence.  Even where there were inaccuracies, the judge was not persuaded to make any of the orders sought or to award compensation.

As to the other alleged breaches his lordship held that the public interest outweighed the case for delisting. He explained his reasoning at para [170]:
"The information retains sufficient relevance today. He has not accepted his guilt, has misled the public and this Court, and shows no remorse over any of these matters. He remains in business, and the information serves the purpose of minimising the risk that he will continue to mislead, as he has in the past. Delisting would not erase the information from the record altogether, but it would make it much harder to find. The case for delisting is not made out."
For much the same reason he held that the public interest outweighed NT1's reasonable  expectation of privacy under art 8 of the Convention.

The Judgment on NT2's Claim
Although the issues in his case were much the same as in NT1's, NT2 impressed the judge as "an honest and generally reliable witness who listened carefully to the questions put to him, and gave clear and relevant answers." The only article, or item, of which the claimant complained that was not a contemporary report of the conviction or sentencing had appeared in a national newspaper over 8 years after NT2 had been sentenced.  The judge found at para [190] that the article was inaccurate and misleading as to the claimant's criminality. Because of that inaccuracy the judge was prepared to make a delisting order.

After performing the Google Spain balancing exercise, the judge concluded at [223]:
"My key conclusions in respect of NT2's delisting claim are that the crime and punishment information has become out of date, irrelevant and of no sufficient legitimate interest to users of Google Search to justify its continued availability, so that an appropriate delisting order should be made. The conviction was always going to become spent, and it did so in March 2014, though it would have done so in July of that year anyway. NT2 has frankly acknowledged his guilt, and expressed genuine remorse. There is no evidence of any risk of repetition. His current business activities are in a field quite different from that in which he was operating at the time. His past offending is of little if any relevance to anybody's assessment of his suitability to engage in relevant business activity now, or in the future. There is no real need for anybody to be warned about that activity."
As to whether NT2 had a reasonable expectation of privacy under art 8 of the Convention, Mr Justice Warby said at [226]:
"The impact on the claimant is such as to engage Article 8. The business prejudice does not suffice for that purpose, but there is just enough in the realm of private and family life to cross the threshold. The existence of a young, second family is a matter of some weight. Even so, the evidence does not, in the end, demonstrate a grave interference. But it is enough to require a justification. Google's case on relevance is very weak. The claimant's evidence suggests that he has acknowledged his past error. The claimant's current and anticipated future business conduct does not make his past conduct relevant to anybody's assessment of him, or not significantly so. Continued accessibility of the information complained of is hard to justify. The factors that go to support that view are weak, by comparison with those that weigh in favour of delisting."
Though the judge decided to make a delisting order he was not persuaded to award compensation as he considered that Google had acted with reasonable care in dealing with NT2's request.

Comment
These are two cases with very similar issues and arguments but significantly different facts.  NT2's wrongdoing was of a lesser order than NT1's. He had expressed contrition. The article of which NT2 complained had been inaccurate and misleading whereas those of which NT1 complained were not.  Unlike NT1, NT2 was trying to rebuild his life in a different business where there was no danger of his repeating his wrongdoing. He was therefore reasonably entitled to privacy.  That is why the balance tipped in NT2's favour but not NT1's.

The judgment is useful in that it lists the authorities to which the court will have regard in future cases and the methodology to be applied in Google Spain cases.  Save that courts will cease to consider the Data Protection Directive and the Data Protection Act 1998 as part of the legal framework after 25 May 2018, the approach to issues of this kind will probably be the same under the GDPR.

Finally, Mr Justice Warby's decision has been widely reported as a defeat for Google (eg Google loses "right to be forgotten" case 13 Apr 2018), but that is not completely true. Google was completely successful in NT1's case and successfully resisted the claim for compensation in NT2's.

Should amplification or clarification be required, call me on 020 7404 5252 during office hours or send me a message through my contact form.

Tuesday, 27 March 2018

Information Commissioner's Charges after GDPR

Bank of England
Author Adrian Pingstone
Licence Copyright waived by owner
Source Wikipedia

Jane Lambert

The General Data Protection Regulation ("GDPR") imposes a number of new obligations on data controllers but, unlike the Data Protection Act 1984 and the Data Protection Act 1998, it does not require them to pay any money to the regulator.  A small but very welcome concession in exchange for responsibilities that will increase the costs of compliance, one might have thought.

Fat chance! Our own Parliament has passed the Digital Economy Act 2017 section 108 (1) of which enables the Secretary of State to make regulations that "require data controllers to pay charges of an amount specified in the regulations to the Information Commissioner." The government has now published draft regulations under that provision known as The Data Protection (Charges and Information) Regulations 2018 which will come into effect on 25 May 2018. The Explanatory Note  states that they will replace The Data Protection (Notification and Notification Fees) Regulations 2000 SI 2000 No 188.

According to the Information Commissioner's press release, this legislation has been enacted because the government has a statutory duty to ensure that the Information Commissioner's Office is adequately funded (see New model announced for funding the data protection work of the Information Commissioner’s Office 21 Feb 2018 ICO's News and Blogs). They have a point there.  I for one was heartened by photos of ICO investigators doing their job in relation to recent personal data misuse allegations (see Investigators complete seven-hour Cambridge Analytica HQ search 24 March 2018 The Guardian).

The amount of the new charges is set out in reg 3 (1):
"For the purposes of regulation 2 (2), the charge payable by a data controller in—
(a) tier 1 (micro organisations), is £40;
(b) tier 2 (small and medium organisations), is £60;
(c) tier 3 (large organisations), is £2,900."
To qualify as a "micro organisation" a business must:
(i)  have a turnover of less than or equal to £632,000 for the data controller’s financial year, or
(ii) have no more than 10 members of staff, or
(iii) be a charity, or
(iv) be a small occupational pension scheme.
As micro organizations will be offered a £5 discount if they pay by direct debit, the new rules will not increase their payments at all if they take advantage of the concession.   For businesses in tier 3 there will be a massive increase from £500 to £2,900 per year.   The new rates are intended to reflect the relative risk for each category of data controller.
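
To make the new charging structure concrete, here is a short Python sketch of the tier 1 test and the £5 direct debit concession as they are described in this post. It is purely illustrative: the function and parameter names are invented for the purpose, and the draft Regulations contain further definitions (including the tier 2 thresholds) that are not reproduced here.

# Illustrative only: the tier 1 ("micro organisation") test and the direct
# debit concession as described in this post, not the full text of the
# draft Data Protection (Charges and Information) Regulations 2018.

def is_micro_organisation(turnover_gbp=None, staff_count=None,
                          is_charity=False, is_small_pension_scheme=False):
    """Any one of the criteria listed above is enough for tier 1."""
    return (
        (turnover_gbp is not None and turnover_gbp <= 632_000)
        or (staff_count is not None and staff_count <= 10)
        or is_charity
        or is_small_pension_scheme
    )

def annual_charge(tier, pays_by_direct_debit=False):
    """Charge under reg 3 (1); the £5 direct debit discount is applied to
    micro organisations, as mentioned in the post."""
    charges = {1: 40, 2: 60, 3: 2_900}
    charge = charges[tier]
    if tier == 1 and pays_by_direct_debit:
        charge -= 5
    return charge

# A micro organisation paying by direct debit pays £35, so its outlay does
# not increase; a tier 3 controller goes from £500 to £2,900 a year.
assert is_micro_organisation(turnover_gbp=500_000, staff_count=8)
assert annual_charge(1, pays_by_direct_debit=True) == 35
assert annual_charge(3) == 2_900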

Anyone wishing to discuss these rules or data protection in general should call me on 020 7404 5252 during office hours or send me a message through my contact form.

Friday, 23 March 2018

Consent to Processing of Personal Data

Author The Opte Project
Licence CC BY 2.5
Source Wikimedia Commons

Jane Lambert

One of the questions I am asked most frequently whenever I give a talk on the General Data Protection Regulation ("GDPR") is whether it is necessary to seek renewed consent from existing subscribers to newsletters and other services. That ties in with something else that has happened over the last few days: I have received several requests to renew subscriptions to newsletters and other online services that I have used for years.

The reason for that frequently asked question is that art 5 (2) of the Regulation requires data controllers not only to comply with data protection principles, which have existed in one form or another in every previous data protection statute as well as in the OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data and the Council of Europe Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data, but also to demonstrate compliance with those principles. The first of those principles is that personal data shall be processed lawfully, fairly and in a transparent manner in relation to the data subject.  Art 6 (1) of the GDPR provides 6 grounds for the lawful processing of personal data, one of which is that "(a)  the data subject has given consent to the processing of his or her personal data for one or more specific purposes" (art 6 (1) (a) GDPR). Data controllers have focused on that ground because it is the easiest to prove.

However, such consent must be freely given, specific, informed and unambiguous.  Art 7 (1) provides:
"Where processing is based on consent, the controller shall be able to demonstrate that the data subject has consented to processing of his or her personal data."
Art 7 (4) adds:
"When assessing whether consent is freely given, utmost account shall be taken of whether, inter alia, the performance of a contract, including the provision of a service, is conditional on consent to the processing of personal data that is not necessary for the performance of that contract. 
Paragraph 32 of the recitals explains the policy for this requirement:
"Consent should be given by a clear affirmative act establishing a freely given, specific, informed and unambiguous indication of the data subject's agreement to the processing of personal data relating to him or her, such as by a written statement, including by electronic means, or an oral statement. This could include ticking a box when visiting an internet website, choosing technical settings for information society services or another statement or conduct which clearly indicates in this context the data subject's acceptance of the proposed processing of his or her personal data. Silence, pre-ticked boxes or inactivity should not therefore constitute consent. Consent should cover all processing activities carried out for the same purpose or purposes. When the processing has multiple purposes, consent should be given for all of them. If the data subject's consent is to be given following a request by electronic means, the request must be clear, concise and not unnecessarily disruptive to the use of the service for which it is provided."
To ensure that the consent is informed, paragraph 42 adds:
"Where processing is based on the data subject's consent, the controller should be able to demonstrate that the data subject has given consent to the processing operation. In particular in the context of a written declaration on another matter, safeguards should ensure that the data subject is aware of the fact that and the extent to which consent is given. In accordance with Council Directive 93/13/EEC a declaration of consent pre-formulated by the controller should be provided in an intelligible and easily accessible form, using clear and plain language and it should not contain unfair terms. For consent to be informed, the data subject should be aware at least of the identity of the controller and the purposes of the processing for which the personal data are intended. Consent should not be regarded as freely given if the data subject has no genuine or free choice or is unable to refuse or withdraw consent without detriment."
Paragraph 43 adds:
"In order to ensure that consent is freely given, consent should not provide a valid legal ground for the processing of personal data in a specific case where there is a clear imbalance between the data subject and the controller, in particular where the controller is a public authority and it is therefore unlikely that consent was freely given in all the circumstances of that specific situation. Consent is presumed not to be freely given if it does not allow separate consent to be given to different personal data processing operations despite it being appropriate in the individual case, or if the performance of a contract, including the provision of a service, is dependent on the consent despite such consent not being necessary for such performance."
Pausing there, it is clear from paragraph 42 of the recitals that consent does not have to be in writing, but it does have to be recorded if it is to be proved.  Art 7 (2) indicates that consent can be sought on a form that contains other matter.  However, if it is, the part relating to consent must be clear and cover all the purposes for which the data are to be processed.  If the data are to be processed for more than one purpose, then the data subject's consent must be obtained separately for each of those purposes.  Art 7 (3) of the GDPR entitles a data subject to withdraw his or her consent at any time.  Data subjects should be advised of their right to withdraw their consent at any time before they give it. It should not be more difficult to withdraw consent than it is to give consent.  Where the data controller and data subject have unequal bargaining power, the data controller should avoid using (or even giving the impression of using) its leverage to extract a data subject's consent.
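
For controllers wondering what "recording" consent might look like in practice, the following Python sketch shows one possible per-purpose consent record. The class and field names are invented for the purpose; nothing in the GDPR prescribes any particular format.

# Illustrative sketch of a per-purpose consent record of the kind a
# controller might keep in order to demonstrate consent under art 7 (1).
# The class and field names are invented; nothing here comes from the GDPR.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    data_subject_id: str
    purpose: str                      # consent is recorded separately per purpose
    given_at: datetime
    method: str                       # e.g. "ticked an unticked opt-in box"
    information_shown: str            # what the data subject was told
    withdrawn_at: Optional[datetime] = None

    def is_valid_now(self) -> bool:
        """Consent can be relied on only until it is withdrawn (art 7 (3))."""
        return self.withdrawn_at is None

    def withdraw(self) -> None:
        # Withdrawal does not affect the lawfulness of processing that took
        # place before it (art 7 (3)).
        self.withdrawn_at = datetime.now(timezone.utc)

# Usage: separate records for separate purposes.
newsletter = ConsentRecord("subscriber-123", "newsletter",
                           datetime.now(timezone.utc),
                           "ticked an unticked opt-in box", "privacy notice v2")
newsletter.withdraw()
assert not newsletter.is_valid_now()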

Nothing in the GDPR suggests that consent has to be obtained or renewed specifically to comply with the Regulation but any consent that has been obtained in the past must have met the Regulation's conditions.  Indeed, paragraph 171 of the recitals states that where processing is based on consent pursuant to the existing law, it is not necessary for the data subject to give his or her consent again if the manner in which the consent has been given is in line with the conditions of the GDPR, so as to allow the controller to continue such processing after the date of application of this Regulation.

The last sentence of art 7 (2) provides:
"Any part of such a declaration which constitutes an infringement of this Regulation shall not be binding."
In other words, a data controller cannot rely on a data subject's consent as a defence to any administrative action, civil claim or criminal prosecution unless the above conditions have been complied with.  On the other hand, if a data subject who has validly given his or her consent subsequently withdraws it, art 7 (3) makes clear that the withdrawal of consent shall not affect the lawfulness of processing based on consent before its withdrawal.

If a data subject is under the age of 16 (or such other age between 13 and 16 that a member state may set) art 8 (1) requires consent to be obtained from the person having parental responsibility for that data subject.  Art 8 (2) requires the data controller to make reasonable efforts to verify in such cases that consent is given or authorised by the holder of parental responsibility over the child, taking into consideration available technology.

Anyone wishing to discuss this article or data protection in general should call me on +44 (0)20 7404 5252 during office hours or message me through my contact form.

Wednesday, 7 February 2018

Judicial Remedies under the GDPR and other Data Protection Legislation

Jane Lambert

A lot of attention has focused on the massive increase in the Information Commissioner's and other supervisory authorities' power to fine under art 83 (4) and (5) of the GDPR, but she acquires no new power to award compensation.  If a data subject requires compensation from a data controller or processor under art 82 (1), or some other judicial remedy pursuant to art 79, he or she will have to sue.

The Data Protection Bill, which has completed its passage through the Lords and is now awaiting its second reading in the House of Commons, makes provision for that judicial remedy.  The courts of the United Kingdom are to have the power to make compliance orders under clause 165 and award compensation under clause 166 and clause 167.

Clause 165 (2) defines a compliance order as
"an order for the purposes of securing compliance with the data protection legislation which requires the controller in respect of the processing, or a processor acting on behalf of that controller—
(a) to take steps specified in the order, or
(b) to refrain from taking steps specified in the order."
This would seem to include an order by the court to a data controller to comply with a subject access request under clause 94 (11), an order not to process personal data under clause 99 (5) and rectification and erasure under clause 100 (4). Though there is no specific provision in the Bill for the court to restrain the transfer of personal data abroad under clause 109 (1) or to order a controller to take steps to implement the data protection principles or minimize the risks to the rights and freedoms of data subjects under clause 103 (2) there seems to be no reason why it should not do so.

As I mentioned in Claims by Data Subjects against Data Controllers and Processors under the GDPR 5 Jan 2018, the provisions relating to subject access, rectification and erasure stipulate that the High Court of England and Wales has exclusive jurisdiction to make such orders. However, there seems to be a contradiction in that clause 177 (1) and (2) suggest that compliance orders, as well as compensation, may be awarded by the County Court as well as by the High Court.

Clause 166 (1) provides for compensation for material or non-material damage including distress under art 82 GDPR for contravention of that regulation and clause 167 (1)  for compensation for material or non-material damage including distress under any other data protection legislation.

In future articles I shall discuss pleading claims  for judicial remedies for alleged breaches of the GDPR and other legislation and possible defences.  Anyone wishing to discuss this article should call me on 020 7404 5252 during office hours or send me a message through my contact form.

Sunday, 14 January 2018

Information Commissioner fines The Carphone Warehouse £400,000 for breaching the Seventh Data Protection Principle

Jane Lambert

In GDPR - Fines 7 Dec 2017 I outlined the Information Commissioner's existing powers under s.55A of the Data Protection Act 1998 and The Data Protection (Monetary Penalties) (Maximum Penalty and Notices) Regulations 2010 to impose monetary penalties on data controllers who contravene s.4 (4) of the Act. As I noted in that article, the maximum penalty that the Commissioner can impose is limited to £500,000 by reg 2 of those Regulations.

By a monetary penalty notice dated 8 Jan 2018 the Information Commissioner fined the Carphone Warehouse £400,000 (80% of the maximum under reg 2) for failing to prevent unauthorized access to the personal data of over 3 million of its customers and some 1,000 of its employees. 

Paragraph 7 of Sched. 1 of the Act provides:
"Appropriate technical and organisational measures shall be taken against unauthorised or unlawful processing of personal data and against accidental loss or destruction of, or damage to, personal data."
Paragraphs 9 to 12 of the schedule add:
"The seventh principle
9. Having regard to the state of technological development and the cost of implementing any measures, the measures must ensure a level of security appropriate to—
(a)   the harm that might result from such unauthorised or unlawful processing or accidental loss, destruction or damage as are mentioned in the seventh principle, and
(b)   the nature of the data to be protected.
10. The data controller must take reasonable steps to ensure the reliability of any employees of his who have access to the personal data.
11. Where processing of personal data is carried out by a data processor on behalf of a data controller, the data controller must in order to comply with the seventh principle—
(a)   choose a data processor providing sufficient guarantees in respect of the technical and organisational security measures governing the processing to be carried out, and
(b)   take reasonable steps to ensure compliance with those measures.
12. Where processing of personal data is carried out by a data processor on behalf of a data controller, the data controller is not to be regarded as complying with the seventh principle unless—
(a) the processing is carried out under a contract—
(i)      which is made or evidenced in writing, and
(ii)     under which the data processor is to act only on instructions from the data controller, and
(b) the contract requires the data processor to comply with obligations equivalent to those imposed on a data controller by the seventh principle."
Based on evidence submitted by the Carphone Warehouse, which included reports by forensic specialists, the Commissioner found at paragraph 22 that the data controller had contravened the above data protection principle in 11 respects, ranging from the use of out of date software to inadequate vulnerability scanning.  Having regard to the state of technological development, the cost of implementing any measures, the nature of the relevant personal data and the harm that might ensue from its misuse, the Commissioner held that there were multiple inadequacies in Carphone Warehouse's technical and organisational measures for ensuring the security of personal data on the System.

The Commissioner concluded that the requirements of s.55A (1) had been met. After considering both aggravating and mitigating factors she fixed the penalty at £400,000, to be paid by 8 Feb 2018.  She offered the data controller a 20% discount if it pays the fine in full by 7 Feb 2018 and does not appeal. If it exercises its right of appeal it will forgo the £80,000 discount. That leaves a very difficult decision for The Carphone Warehouse and its lawyers. If the company accepts the Commissioner's finding, it risks claims for compensation in the civil courts by any one or more of its 3 million customers and 1,000 employees. On the other hand, it will not be easy to appeal and the costs could well exceed £320,000.
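
The arithmetic behind that choice is simple and is set out below; the figures merely restate those reported above.

# The figures as reported above: a £400,000 penalty with a 20% reduction
# for payment in full by 7 Feb 2018 and no appeal.
penalty = 400_000
reduction = 0.20

early_payment = penalty * (1 - reduction)             # £320,000
discount_forgone_on_appeal = penalty - early_payment  # £80,000

print(f"Pay early without appealing: £{early_payment:,.0f}")
print(f"Appeal and lose the discount: £{penalty:,} (forgoing £{discount_forgone_on_appeal:,.0f})")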

Should anyone wish to discuss this note or data protection generally, he or she should call me on 020 7404 5252 during normal business hours or send me a message through my contact form.