Friday, 4 October 2019

Lloyd v Google LLC

Author Gciriani
Licence CC BY-SA 4.0
Source Wikipedia Google

Jane Lambert

Court of Appeal (Dame Victoria Sharp P, Sir Geoffrey Vos C, Lord Justice Davis) Lloyd v Google LLC [2019] EWCA Civ 1599 (2 Oct 2019)

This was an appeal against Mr Justice Warby's refusal to allow the claimant, Richard Lloyd ("Mr Lloyd"), to serve proceedings on Google LLC ("Google") outside the jurisdiction claiming damages on behalf of 4 million iPhone users whose internet activity Google allegedly tracked secretly for commercial purposes between 9 Aug 2011 and 15 Feb 2012. The appeal was heard on 16 and 17 July 2019 by the President of the Queen's Bench Division, the Chancellor and Lord Justice Davis. Judgment was given on 2 Oct 2019. The lead judgment was delivered by the Chancellor, Sir Geoffrey Vos.

The Issues
The facts in this appeal were very similar to those in Google Inc v Vidal-Hall and others [2015] EWCA Civ 311, [2016] QB 1003, [2015] 3 WLR 409, [2016] 2 All ER 337, [2015] CP Rep 28, [2015] FSR 25, [2015] 3 CMLR 2, [2015] EMLR 15, [2015] WLR(D) 156, where the Court of Appeal dismissed Google's appeal against Mr Justice Tugendhat's decision to allow a similar claim to be served outside the jurisdiction (see Vidal-Hall and others v Google Inc [2014] EWHC 13 (QB) (16 Jan 2014), [2014] 1 WLR 4155, [2014] EMLR 14, [2014] FSR 30, [2014] 1 CLC 201, [2014] WLR(D) 21). However, the Chancellor pointed out at paragraph [3] of his judgment that there was one crucial difference between the two cases. In Vidal-Hall, the individual claimants claimed damages for distress as a result of Google's breaches of the Data Protection Act 1998 ("DPA"). In the present case, Mr Lloyd claimed a uniform amount by way of damages on behalf of each person within the defined class without seeking to allege or prove any distinctive facts affecting any of them, save that they did not consent to the abstraction of their data.

The Chancellor analysed Mr Justice Warby's decision between paragraphs [25] and [39] of his judgment.  According to the Chancellor, the grounds on which the application had been refused were  "that: (a) none of the represented class had suffered 'damage' under section 13 of the Data Protection Act 1998 (the 'DPA'), (b) the members of the class did not anyway have the 'same interest' within CPR Part 19.6 (1) so as to justify allowing the claim to proceed as a representative action, and (c) the judge of his own initiative exercised his discretion under CPR Part 19.6 (2) against allowing the claim to proceed."

His lordship summarized the main issues raised by the appeal as follows:
"(a) whether the judge was right to hold that a claimant cannot recover uniform per capita damages for infringement of their data protection rights under section 13 of the DPA, without proving pecuniary loss or distress, (b) whether the judge was right to hold that the members of the class did not have the same interest under CPR Part 19.6 (1) and were not identifiable, and (c) whether the judge's exercise of discretion can be vitiated."
The Facts
Sir Geoffrey adopted the following paragraphs from Mr Justice Warby's judgment:
"[7]. The case concerns the acquisition and use of browser generated information or "BGI". This is information about an individual's internet use which is automatically submitted to websites and servers by a browser, upon connecting to the internet. BGI will include the IP address of the computer or other device which is connecting to the internet, and the address or URL of the website which the browser is displaying to the user. As is well-known, "cookies" can be placed on a user's device, enabling the placer of the cookie to identify and track internet activity undertaken by means of that device.
[8]. Cookies can be placed by the website or domain which the user is visiting, or they may be placed by a domain other than that of the main website the user is visiting ("Third Party Cookies"). Third Party Cookies can be placed on a device if the main website visited by the user includes content from the third party domain. Third Party Cookies are often used to gather information about internet use, and in particular sites visited over time, to enable the delivery to the user of advertisements tailored to the interests apparently demonstrated by a user's browsing history ("Interest Based Adverts").
[9]. Google had a cookie known as the "DoubleClick Ad cookie" which could operate as a Third Party Cookie. It would be placed on a device if the user visited a website that included content from Google's Doubleclick domain. The purpose of the DoubleClick Ad cookie was to enable the delivery and display of Interest Based Adverts.
[10]. Safari is a browser developed by Apple. At the relevant time, unlike most other internet browsers, all relevant versions of Safari were set by default to block Third Party Cookies. However, a blanket application of these default settings would prevent the use of certain popular web functions, so Apple devised some exceptions to the default settings. These exceptions were in place until March 2012, when the system was changed. But in the meantime, the exceptions enabled Google to devise and implement the Safari Workaround. Stripped of technicalities, its effect was to enable Google to set the DoubleClick Ad cookie on a device, without the user's knowledge or consent, immediately, whenever the user visited a website that contained DoubleClick Ad content.
[11]. This enabled Google to identify visits by the device to any website displaying an advertisement from its vast advertising network, and to collect considerable amounts of information. It could tell the date and time of any visit to a given website, how long the user spent there, which pages were visited for how long, and what ads were viewed for how long. In some cases, by means of the IP address of the browser, the user's approximate geographical location could be identified. Over time, Google could and did collect information as to the order in which and the frequency with which websites were visited. It is said by the claimant that this tracking and collating of BGI enabled Google to obtain or deduce information relating not only to users' internet surfing habits and location, but also about such diverse factors as their interests and habits, race or ethnicity, social class, political or religious views or affiliations, age, health, gender, sexuality, and financial position.
[12]. Further, it is said that Google aggregated BGI from browsers displaying sufficiently similar patterns, creating groups with labels such as "football lovers", or "current affairs enthusiasts". Google's DoubleClick service then offered these groups to subscribing advertisers, allowing them to choose … the type of people that they wanted to direct their advertisements to".
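Stripped of legal context, the mechanism the judge describes can be illustrated with a short sketch. This is purely an illustrative model under stated assumptions (the class and variable names are invented, and it is not Google's actual implementation): a third-party advertising domain sets an identifying cookie on first contact and thereafter logs every first-party page that embeds its content under that identifier, building the kind of cross-site browsing history described at paragraphs [10] and [11].

```python
from collections import defaultdict
import uuid


class ThirdPartyTracker:
    """Toy model of a third-party (advertising) domain that sets a
    cookie on first contact and records every first-party page that
    embeds its content, keyed by that cookie."""

    def __init__(self):
        # cookie id -> list of first-party pages on which the
        # tracker's content was loaded
        self.visits = defaultdict(list)

    def serve_content(self, browser_cookies, page_url):
        # If the browser holds no cookie for this tracker yet, set one
        # (the step the Safari Workaround allegedly achieved without
        # the user's knowledge or consent).
        cookie = browser_cookies.get("tracker_id")
        if cookie is None:
            cookie = str(uuid.uuid4())
            browser_cookies["tracker_id"] = cookie
        # Log the visit against the cookie identifier.
        self.visits[cookie].append(page_url)
        return cookie


# One browser visiting three unrelated sites, each embedding the tracker
tracker = ThirdPartyTracker()
cookies = {}  # the browser's cookie jar for the tracker's domain
for url in ["news.example/politics", "shop.example/shoes",
            "sport.example/football"]:
    tracker.serve_content(cookies, url)

# The tracker now holds a cross-site browsing history for this browser
history = tracker.visits[cookies["tracker_id"]]
```

Running the sketch leaves a single cookie identifier linked to a browsing history spanning three unrelated sites, which is the essence of the aggregation and profiling described at paragraphs [11] and [12].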
Proceedings in the USA
The US Federal Trade Commission brought proceedings against Google for misrepresenting to Safari users that it would not place tracking cookies on their browsers or send them targeted advertising, which Google settled by agreeing to pay a civil penalty of US$22.5 million. It also settled an action brought by 37 states and the District of Columbia on behalf of their consumers by agreeing to pay US$17 million in damages and giving certain undertakings.

Proceedings in the UK
Mr Justice Warby had noted at paragraph [14] of his judgment that similar proceedings had not been brought in the UK by the Information Commissioner, but he mentioned the Vidal-Hall claim discussed above.

Applicable Law
Sir Geoffrey referred to paragraphs (2), (7), (8), (10), (11) and (55) of the recitals and arts 1, 22 and 23 of Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data ("the Directive").  He also referred to ss.1 (1) (a) and (b), 3, 4 (1) and (4), 13 (1) and (2) and 14 (4) of the Data Protection Act 1998. Finally, he referred to CPR 19.6 (1), (2), (3) and (4).

Was the judge right to hold that a claimant cannot recover uniform per capita damages for infringement of their data protection rights under section 13 without proving pecuniary loss or distress?
The Chancellor affirmed that s.13 of the Data Protection Act 1998 has to be construed in accordance with art 23 of the Directive, which had been adopted to give effect to art 8 of the European Convention on Human Rights. He also noted that the parties had agreed that there was a de minimis threshold for an award of damages. After considering the decisions in Gulati and others v MGN Ltd [2015] EWHC 1482 (Ch), [2015] WLR(D) 232, which was a case on the misuse of private information, Halliday v Creation Consumer Finance Ltd (CCF) [2013] EWCA Civ 333 (15 March 2013), which concerned damages under s.13 of the Data Protection Act 1998, and other authorities, his lordship concluded at [70] "that damages are in principle capable of being awarded for loss of control of data under article 23 and section 13, even if there is no pecuniary loss and no distress." He added that it was only by construing the legislation in this way that individuals could be provided with an effective remedy for the infringement of their rights under the Act.

Was the judge right to hold that the members of the class did not have the same interest under CPR Part 19.6(1) and were not identifiable?
CPR 19.6 (1) provides:
"Where more than one person has the same interest in a claim –
(a) the claim may be begun; or
(b) the court may order that the claim be continued,
by or against one or more of the persons who have the same interest as representatives of any other persons who have that interest."
Mr Justice Warby had held that a representative claim was disqualified unless (a) "every member of the class [had] suffered the same damage (or their share of a readily ascertainable aggregate amount [was] clear)", and (b) different potential defences were not available in respect of claims by different members of the class.  In the present case, for example, some in the claimant class would have been heavy internet users with much BGI taken; it was not credible that all the specified categories of data were obtained by Google from each represented claimant. The same variations would apply if the user principle were applied. Neither the breach of duty nor the impact of it was uniform across the entire class membership.

Sir Geoffrey believed that Mr Justice Warby had applied too stringent a test of "same interest", partly because of his earlier finding on recoverable damages. He observed at [75]:
"Once it is understood that the claimants that Mr Lloyd seeks to represent will all have had their BGI – something of value - taken by Google without their consent in the same circumstances during the same period, and are not seeking to rely on any personal circumstances affecting any individual claimant (whether distress or volume of data abstracted), the matter looks more straightforward. The represented class are all victims of the same alleged wrong, and have all sustained the same loss, namely loss of control over their BGI. Mr Tomlinson disavowed, as I have said, reliance on any facts affecting any individual represented claimant. That concession has the effect, of course, of reducing the damages that can be claimed to what may be described as the lowest common denominator. But it does not, I think, as the judge held, mean that the represented claimants do not have the same interest in the claim. Finally, in this connection, once the claim is understood in the way I have described, it is impossible to imagine that Google could raise any defence to one represented claimant that did not apply to all others. The wrong is the same, and the loss claimed is the same. The represented parties do, therefore, in the relevant sense have the same interest. Put in the more old-fashioned language of Lord Macnaghten in The Duke of Bedford at [8], the represented claimants have a 'common interest and a common grievance' and 'the relief sought [is] in its nature beneficial to all'".
Mr Justice Warby had also held that a class of claimants having the same interest could not be identified. The Chancellor disagreed. He said at [81]:
"In my judgment, therefore, the judge ought to have held that the members of the represented class had the same interest under CPR Part 19.6(1) and that they were identifiable."
Can the judge's exercise of discretion be vitiated?
Having reached a different conclusion on the other two issues, the Chancellor considered that it was appropriate for the court to exercise its discretion afresh.  Having considered carefully all the factors raised by both sides he concluded that this was a claim which, as a matter of discretion, should be allowed to proceed.

Conclusion
The President of the Queen's Bench Division and Lord Justice Davis agreed with the Chancellor's judgment. The appeal was therefore allowed and permission was granted to serve the claim on Google in the USA.

Anyone wishing to discuss this appeal or data protection generally should call me on +44 (0)20 7404 5252 or send me a message through my contact form.

Wednesday, 24 October 2018

The Morrisons Appeal - Vicarious Liability for Employees' Breaches of Confidence and Statutory Duty

Royal Courts of Justice
Author Rafa Esteve
Licence Creative Commons Attribution Share Alike 4.0 International
Source Wikipedia

Jane Lambert

Court of Appeal (Sir Terence Etherton MR and Lords Justices Bean and Flaux) Various Claimants v W M Morrison Supermarkets Plc  [2018] EWCA Civ 2339 (22 Oct 2018)

In Various Claimants v WM Morrisons Supermarkets Plc (Rev 1) [2017] EWHC 3113 (QB), [2018] 3 WLR 691, Mr Justice Langstaff held that W M Morrisons Supermarket Plc ("Morrisons") was vicariously liable to its employees for the unauthorized act of one Skelton, an internal auditor, who had posted the names, addresses, gender, dates of birth, phone numbers (home or mobile), national insurance numbers, bank sort codes, bank account numbers and salaries of Morrisons' employees to a file sharing website. Skelton had acted as he did out of spite. He had a grudge against Morrisons and wanted to injure the company. The judge acknowledged at para [198] of his judgment that the effect of his judgment was to accomplish that injury, and for that reason he gave the supermarket chain permission to appeal. I commented on the case in Morrisons - Primary and Vicarious Liability for Breaches of the Data Protection Act 1998 on 11 Dec 2017.

The defendant appealed on the following grounds:
"First, the Judge ought to have concluded that, on its proper interpretation and having regard to the nature and purposes of the statutory scheme, [the Data Protection Act 1998 ("the DPA")] excludes the application of vicarious liability. Second, the Judge ought to have concluded that, on its proper interpretation, the DPA excludes the application of causes of action for misuse of private information and breach of confidence and/or the imposition of vicarious liability for breaches of the same. Third, the Judge was wrong to conclude (a) that the wrongful acts of Mr Skelton occurred during the course of his employment by Morrisons, and, accordingly, (b) that Morrisons was vicariously liable for those wrongful acts."
By their respondents' notice, the claimants sought to uphold the judge's order on the additional ground "that, in evaluating whether there was a sufficient connection between Mr Skelton's employment and his wrongful conduct to make it right for Morrisons to be held vicariously liable, the Judge ought to have taken into account that Mr Skelton's job included the task or duty delegated to him by Morrisons of preserving confidentiality in the claimants' payroll information." The appeal was heard by the Master of the Rolls and Lord Justices Bean and Flaux on 9 and 10 Oct 2018, and judgment was delivered on 22 Oct 2018.

Their lordships dismissed Morrisons' appeal.

As for the first and second grounds, the Court concluded at para [48] that it was clear that the vicarious liability of an employer for misuse of private information by an employee and for breach of confidence by an employee had not been excluded by the Data Protection Act 1998.  The applicable principle for determining that issue was whether, on the true construction of the statute in question,  Parliament had intended to exclude vicarious liability.  The appropriate test was:
"If the statutory code covers precisely the same ground as vicarious liability at common law, and the two are inconsistent with each other in one or more substantial respects, then the common law remedy will almost certainly have been excluded by necessary implication. As Lord Dyson said in the Child Poverty Action Group case (at [34]) the question is whether, looked at as a whole, the common law remedy would be incompatible with the statutory scheme and therefore could not have been intended to coexist with it."
Their lordships reasoned that if Parliament had intended to exclude that cause of action, it would have said so expressly. Secondly, Morrisons' counsel had conceded in her submissions that the Act had not excluded the action for breach of confidence or misuse of personal information.   Their lordships observed at [56]:
"Morrisons' acceptance that the causes of action at common law and in equity operate in parallel with the DPA in respect of the primary liability of the wrongdoer for the wrongful processing of personal data while at the same time contending that vicarious liability for the same causes of action has been excluded by the DPA is, on the face of it, a difficult line to tread."
They added at [57]:
"... the difficulty of treading that line becomes insuperable on the facts of the present case because, as was emphasised by Mr Barnes [the claimants' counsel], the DPA says nothing at all about the liability of an employer, who is not a data controller, for breaches of the DPA by an employee who is a data controller."
The concession that the causes of action for misuse of private information and breach of confidence are not excluded by the Act in respect of the wrongful processing of data within the ambit of the statute, and the complete absence of any provision addressing the situation of an employer where an employee data controller breaches the requirements of the Act, led inevitably to the conclusion that Mr Justice Langstaff was correct to hold that the common law remedy of vicarious liability of the employer was not expressly or impliedly excluded by the Act.

In respect of the third ground of appeal, the Court referred to the judgment of Lord Toulson in Mohamud v WM Morrison Supermarkets Plc [2016] UKSC 11, [2016] AC 677, [2016] 2 WLR 821, [2017] 1 All ER 15, [2016] ICR 485, [2016] IRLR 362, [2016] PIQR P11, [2016] WLR(D) 109. At para [44] Lord Toulson had asked "what functions or "field of activities" have been entrusted by the employer to the employee, or, in everyday language, what was the nature of his job?" Next, "the court must decide whether there was sufficient connection between the position in which he was employed and his wrongful conduct to make it right for the employer to be held liable under the principle of social justice which goes back to Holt CJ." As to Lord Toulson's first question, the Court of Appeal endorsed the trial judge's finding that Morrisons had entrusted Skelton with payroll data and that it was part of his job to disclose it to a third party. He had clearly exceeded his authority, but that did not matter because his wrongdoing was nonetheless closely related to the task that he had to do. As to the second part of Lord Toulson's test, the Court endorsed Mr Justice Langstaff's finding that there was an unbroken thread linking his work to the disclosure.

As noted above, the trial judge had been troubled by the thought that the court was facilitating Skelton's wrongdoing.  The Court of Appeal noted at para [75] that it had not been shown any  reported case in which the motive of the employee committing the wrongdoing was to harm his employer rather than to achieve some benefit for himself or to inflict injury on a third party.  Morrisons submitted that it would be wrong to impose vicarious liability on an employer in circumstances such as this especially as there were so many potential claimants.   Their lordships had no trouble in rejecting those submissions.  Motive was irrelevant and to have held otherwise would have left thousands of hapless data subjects without remedy.

In Mohamud, Lord Toulson had remarked at para [40] of his judgment that:
"The risk of an employee misusing his position is one of life's unavoidable facts."
The solution for employers was to insure against liability for the misdeeds of their staff.   As the Master of the Rolls put it at [78]:
"There have been many instances reported in the media in recent years of data breaches on a massive scale caused by either corporate system failures or negligence by individuals acting in the course of their employment. These might, depending on the facts, lead to a large number of claims against the relevant company for potentially ruinous amounts. The solution is to insure against such catastrophes; and employers can likewise insure against losses caused by dishonest or malicious employees. We have not been told what the insurance position is in the present case, and of course it cannot affect the result. The fact of a defendant being insured is not a reason for imposing liability, but the availability of insurance is a valid answer to the Doomsday or Armageddon arguments put forward by Ms Proops on behalf of Morrisons."
That last paragraph will be one of the reasons why this case will appear in countless skeleton arguments and law reports in the future. The other is the Court's analysis of the circumstances in which a statutory code displaces common law remedies.

Anyone wishing to discuss this case or data protection generally should call me on 020 7404 5252 or send me a message through my contact form.

Sunday, 14 October 2018

Privacy Sandbox

Author Hyena
Reproduced with kind permission of the author
Source Wikipedia

Jane Lambert

The Information Commissioner's Office has just carried out a consultation on creating a regulatory sandbox "to develop innovative products and services using personal data in innovative ways" (see ICO call for views on creating a regulatory sandbox on the ICO website).  The idea of a sandbox was pioneered by the Financial Conduct Authority which described it as  "a ‘safe space’ in which businesses can test innovative products, services, business models and delivery mechanisms without immediately incurring all the normal regulatory consequences of engaging in the activity in question" (see FCA Regulatory Sandbox Nov 2015). 

The FCA's idea of a sandbox for new products, services and business models proved not only feasible but popular and has been imitated by other financial services regulators around the world.  The Information Commissioner announced her intention of extending the idea to data protection in her Information Rights Strategic Plan 2017 - 2021:
"Technology goal #8: To engage with organisations in a safe and controlled environment to understand and explore innovative technology. 
  • We will establish a ‘regulatory sandbox’, drawing on the successful sandbox process that the Financial Conduct Authority has developed. The ICO sandbox will enable organisations to develop innovative digital products and services, whilst engaging with the regulator, ensuring that appropriate protections and safeguards are in place. As part of the sandbox process the ICO would provide advice on mitigating risks and data protection by design. 
  • In 2018 we will consult and engage with organisations about implementation of a sandbox."
The consultation closed on Friday but the Call for Evidence makes clear that that was only the first stage of the consultation process. There will be a more detailed proposal for consultation later in the year.

In his blog post Your views will help us build our regulatory sandbox, Chris Taylor, Head of Assurance at the ICO, set out the topics upon which he wants to hear from the public:
  • "what you think the scope of any such sandbox should be - should we focus on particular innovations, sectors or types of organisations?
  • what you think the benefits might be to working in a sandbox, whether that’s our expert input or increased reassurance for your customers or clients.
  • what mechanisms you might find most helpful in a sandbox – from adaptations to our approach, to informal steers or the provision of technical guidance – what are the tools that a sandbox might contain?
  • at what stage in the design and development process a sandbox would be most useful to you?"
Mr Taylor also made a point that applies to innovation generally and not just to data protection law. It is often said in the USA, and in some quarters in this country, that red tape (a derogatory term for regulation) hobbles innovation and enterprise. If that were so, the USA would be the most innovative nation on earth, but a glance at the Global Innovation Index shows that it lies behind four European nations, including the UK.

Mr Taylor explains that
"privacy and innovation go hand in hand. It’s not privacy or innovation, it’s privacy and innovation – because organisations that use our sandbox won’t be exempt from data protection law."
A regulatory sandbox enables regulators to anticipate and make provision for difficulties before they arise, thus sparing new technologies and businesses from the legal quagmires that dogged earlier technologies, when the law reacted to innovation after the event, often imperfectly.

The proposed sandbox should mitigate the privacy uncertainties affecting new products, services and business models, but it will not remove them all. There will remain other issues such as patenting or other IP protection, licensing, competition and so forth. I am well placed and should be glad to help fintech and other entrepreneurs, or the patent and trade mark attorneys, solicitors, accountants and other professional advisers who may assist them. Should any of them wish to discuss this article or data protection generally, they are welcome to call me on +44 (0)20 7404 5252 during office hours or send me a message through my contact form.

Thursday, 11 October 2018

Data Protection after Brexit

Author Furfur
Licence Creative Commons Attribution Share Alike 4.0 International

Jane Lambert

Because of the importance of its financial services industry, preserving an uninterrupted flow of personal data across national frontiers is particularly important to the UK. At present, such flow is guaranteed by art 1 (3) of the General Data Protection Regulation (Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC) ("the GDPR"). When the UK leaves the EU, the GDPR will cease to apply to this country and the UK will become a third country for the purposes of Chapter V of the GDPR.

Our exit from the EU will have no effect on the obligations of data controllers and processors in the UK, or on the rights of data subjects anywhere with regard to UK controllers and processors, because s.3 (1) of the European Union (Withdrawal) Act 2018 will incorporate the GDPR into our law. The Department for Digital, Culture, Media and Sport has confirmed in its guidance note Data Protection if there's no Brexit deal of 13 Sept 2018 that
"[i]n recognition of the unprecedented degree of alignment between the UK and EU’s data protection regimes, the UK would at the point of exit continue to allow the free flow of personal data from the UK to the EU"
though it adds that  the UK would keep this under review.  On the other hand, art 44 of the GDPR makes clear that data controllers and processors in the states that remain in the EU would be able to transmit personal data to the UK only in accordance with the provisions of Chapter V of the regulation.

Art 45 (1) of the GDPR provides:
"A transfer of personal data to a third country or an international organisation may take place where the Commission has decided that the third country, a territory or one or more specified sectors within that third country, or the international organisation in question ensures an adequate level of protection."
The  Department for Digital, Culture, Media and Sport's guidance note notes that the "European Commission has stated that if it deems the UK’s level of personal data protection essentially equivalent to that of the EU, it would make an adequacy decision allowing the transfer of personal data to the UK without restrictions." However it adds that while HM government wants to begin preliminary discussions on an adequacy assessment now, the Commission has stated that a decision on adequacy cannot be taken until the UK is a third country. 

Unless and until the Commission makes an adequacy assessment businesses in the UK must rely on one of the other provisions of Chapter V of the GDPR.  The guidance note suggests:
"For the majority of organisations the most relevant alternative legal basis would be standard contractual clauses. These are model data protection clauses that have been approved by the European Commission and enable the free flow of personal data when embedded in a contract. The clauses contain contractual obligations on you and your EU partner, and rights for the individuals whose personal data is transferred. In certain circumstances, your EU partners may alternatively be able to rely on a derogation to transfer personal data."
It recommends businesses proactively to consider what action they may need to take to ensure the continued free flow of data with EU partners.

If the British government and the EU reach a withdrawal agreement in time for ratification before 29 March 2019, there will be an implementation period in which the GDPR will continue to apply to the UK until 31 Dec 2020. What happens after that will depend on the terms of the agreement on the future relationship between the EU and the UK. At para 3.2.1 (8) of its white paper The Future Relationship between the United Kingdom and the European Union (Cm 9503) the government says:
"The UK believes that the EU’s adequacy framework provides the right starting point for the arrangements the UK and the EU should agree on data protection but wants to go beyond the framework in two key respects:
a. on stability and transparency, it would benefit the UK and the EU, as well as businesses and individuals, to have a clear, transparent framework to facilitate dialogue, minimise the risk of disruption to data flows and support a stable relationship between the UK and the EU to protect the personal data of UK and EU citizens across Europe; and
b. on regulatory cooperation, it would be in the UK’s and the EU's mutual interest to have close cooperation and joined up enforcement action between the UK's Information Commissioner's Office (ICO) and EU Data Protection Authorities."
It is still not clear whether the EU will agree to the white paper proposal, or even whether there will be a withdrawal agreement that will allow a transitional period.

Anyone wishing to discuss this article or data protection generally should call me on 020 7404 5252 during office hours or send me a message through my contact form.

Monday, 25 June 2018

What is meant by the "Applied GDPR"

Jane Lambert

The term the "applied GDPR" is defined by s.3 (11) of the Data Protection Act 2018 as the GDPR as applied by Chapter 3 of Part 2 of the Act. According to s.4 (3), Chapter 3 applies to certain types of processing of personal data to which the GDPR does not apply and makes provision for a regime broadly equivalent to the GDPR to apply to such processing. S.22 (1) of the Act provides that the GDPR applies to the processing of personal data to which Chapter 3 applies as if its articles were part of an Act of Parliament.

Processing to which Chapter 3 applies
S.21 provides that Chapter 3 applies to:

  • automated or structured processing of personal data in the course of an activity that:
    • falls outside the scope of EU law; or
    • is carried out by a member state in relation to the EU's common foreign and security policy, but does not fall within law enforcement (which is covered by Part 3) or processing by the intelligence services (which is covered by Part 4) (s.21 (1)); and
  • manual unstructured processing of personal data held by certain public authorities (s.21 (2)).
S.22 (1) extends the GDPR to the processing of personal data to which Chapter 3 applies as if the GDPR's articles were part of an Act of Parliament for the whole UK.   The explanatory note explains that Chapter 3 applies to manual unstructured processing of personal data held by certain public authorities because such processing was regulated by the Data Protection Act 1998 but not by the GDPR. The public authorities concerned are defined by s.21 (5) as public authorities as defined by the Freedom of Information Act 2000 or Scottish public authorities as defined by the Freedom of Information (Scotland) Act 2002.

Modifications to the GDPR
The provisions of the GDPR that apply to the processing to which Chapter 3 applies are modified by Part 1 of Sched. 6 to the Act. That Part consists of 72 paragraphs, most of which modify articles of the GDPR. For instance, art 2 of the GDPR provides:

"Material scope
1. This Regulation applies to the processing of personal data wholly or partly by automated means and to the processing other than by automated means of personal data which form part of a filing system or are intended to form part of a filing system.
2. This Regulation does not apply to the processing of personal data:
(a) in the course of an activity which falls outside the scope of Union law;
(b) by the Member States when carrying out activities which fall within the scope of Chapter 2 of Title V of the TEU;
(c) by a natural person in the course of a purely personal or household activity;
(d) by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, including the safeguarding against and the prevention of threats to public security.
3. For the processing of personal data by the Union institutions, bodies, offices and agencies, Regulation (EC) No 45/2001 applies. Regulation (EC) No 45/2001 and other Union legal acts applicable to such processing of personal data shall be adapted to the principles and rules of this Regulation in accordance with Article 98.
4. This Regulation shall be without prejudice to the application of Directive 2000/31/EC, in particular of the liability rules of intermediary service providers in Articles 12 to 15 of that Directive."
Para 7 substitutes the following provision for art 2 of the GDPR in relation to the processing to which Chapter 3 applies:
“2  This Regulation applies to the processing of personal data to which Chapter 3 of Part 2 of the 2018 Act applies (see section 21 of that Act).”
Supplementary Provisions
As I noted in The Relationship between the Data Protection Act 2018 and the GDPR 20 June 2018, s.4 (2) of the Act provides for Chapter 2 of Part 2 to apply to the types of processing of personal data to which the GDPR applies by virtue of art 2 of the GDPR.  I discussed the provisions of Chapter 2 in that article.  Chapter 2 also applies to the applied GDPR as it applies to the GDPR by virtue of s.22 (2), but Part 2 of Sched. 6 modifies Chapter 2 of Part 2 in respect of the applied GDPR pursuant to s.22 (4) (b).

Interpretation of the Applied GDPR
S.22 (5) of the Act provides:
"A question as to the meaning or effect of a provision of the applied GDPR, or the applied Chapter 2 , is to be determined consistently with the interpretation of the equivalent provision of the GDPR, or Chapter 2 of this Part, as it applies otherwise than by virtue of this Chapter, except so far as Schedule 6 requires a different interpretation."
Rule Making Powers
S.23 (1) enables the Secretary of State to make regulations in relation to the processing of personal data to which Chapter 3 applies.

Manual Unstructured Data
S.24 makes certain modifications to the applied GDPR in relation to unstructured data held by public authorities as defined by the Freedom of Information Act 2000 or Scottish public authorities as defined by the Freedom of Information (Scotland) Act 2002.

Exemptions
Exemptions are made for manual unstructured data used in longstanding historical research by virtue of s.25, and for national security and defence pursuant to s.26, s.27 and s.28.

Further Information
Anyone wishing to discuss this article or data protection generally should call me during office hours on +44 (0)20 7404 5252 or send me a message through my contact form.

Wednesday, 20 June 2018

The Relationship between the Data Protection Act 2018 and the GDPR

Jane Lambert

As I mentioned on the index page for the Data Protection Act 2018, s.1 (1) of the Act states that the Act makes provision about the processing of personal data.  As everyone knows, most processing of personal data is subject to the GDPR, but the GDPR makes many references to national law.  Even though the GDPR is directly applicable in the laws of each of the member states by virtue of art 288 of the Treaty on the Functioning of the European Union, the GDPR needs to be supplemented by national legislation to function effectively.  That is why s.1 (3) provides that Part 2 of the Act supplements the GDPR.

The Legislative Scheme
S.1 (1) and (2) are amplified by s.2 (1) which provides:
"The GDPR, the applied GDPR and this Act protect individuals with regard to the processing of personal data, in particular by—
(a) requiring personal data to be processed lawfully and fairly, on the basis of the data subject’s consent or another specified basis,
(b) conferring rights on the data subject to obtain information about the processing of personal data and to require inaccurate personal data to be rectified, and
(c) conferring functions on the Commissioner, giving the holder of that office responsibility for monitoring and enforcing their provisions."
S.4 (2) adds that Chapter 2 of Part 2 applies to the types of processing of personal data to which the GDPR applies by virtue of art 2 and that that Chapter supplements, and must be read with, the GDPR.

Understanding the Scheme
Probably the best way to understand the scheme is to take an example. 

Art 5 of the GDPR  sets out a number of principles for the processing of personal data.  The first of those principles is that personal data shall be processed lawfully, fairly and in a transparent manner in relation to the data subject.  Art 6 (1) stipulates that processing shall be lawful only if and to the extent that one or more specified circumstances apply. One of those circumstances is that processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller (point "e").

What constitutes the public interest and official authority are matters for the legislatures of the member states.   S.8 of the Data Protection Act 2018 provides:
"In Article 6 (1) of the GDPR (lawfulness of processing), the reference in point (e) to processing of personal data that is necessary for the performance of a task carried out in the public interest or in the exercise of the controller’s official authority includes processing of personal data that is necessary for—
(a) the administration of justice,
(b) the exercise of a function of either House of Parliament,
(c) the exercise of a function conferred on a person by an enactment or rule of law,
(d) the exercise of a function of the Crown, a Minister of the Crown or a government department, or
(e) an activity that supports or promotes democratic engagement."
There are similar supplementary provisions on such matters as children's consent, special categories of personal data, powers to make regulations on the fees that can be charged by data controllers in exceptional circumstances, exemptions and transfers abroad.

Further Information
Should anyone wish to discuss this article or data protection generally, he or she should call me on 020 7404 5252 during office hours or send me a message through my contact form.

Monday, 11 June 2018

The Data Protection Act 2018 - repealing the 1998 Act and applying the GDPR

Jane Lambert


As everyone knows, the GDPR (Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation)) repealed and replaced the Data Protection Directive (Directive 95/46/EC) with effect from 25 May 2018.

Even though it repealed the Directive, which was implemented into English and Welsh, Scottish and Northern Irish law by the Data Protection Act 1998, the GDPR did not automatically repeal the 1998 Act, although the doctrine of the primacy of EU law recognised by the House of Lords in R (Factortame Ltd) v Secretary of State for Transport (No 2) [1990] UKHL 13, [1991] 1 AC 603, [1991] 1 All ER 70, [1990] 3 WLR 818, [1991] 1 Lloyd's Rep 10, (1991) 3 Admin LR 333, [1990] 3 CMLR 375 would have had that practical effect.

For the avoidance of any doubt, Parliament passed the Data Protection Act 2018, which received royal assent on 23 May 2018, two days before the General Data Protection Regulation ("GDPR") was due to take effect.  The introductory text describes the new Act as:
"An Act to make provision for the regulation of the processing of information relating to individuals; to make provision in connection with the Information Commissioner’s functions under certain regulations relating to information; to make provision for a direct marketing code of practice; and for connected purposes."
It consists of 215 sections and 20 schedules.  It is intended to supplement the GDPR and implement the Data Protection Law Enforcement Directive (Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, and repealing Council Framework Decision 2008/977/JHA).

Because the Act received royal assent only days before the GDPR was due to come into effect, the following provisions came into effect immediately:
The very next day, Margot James MP, Minister for State at the Department for Digital, Culture, Media and Sport, signed The Data Protection Act 2018 (Commencement No. 1 and Transitional and Saving Provisions) Regulations 2018 SI 2018 No 625. Reg 2 (1) of those Regulations brought the following provisions of the Data Protection Act 2018 into effect from 25 May 2018:
It will be seen that most of the Act is already in force and the few provisions that are not will come into force on 23 July 2018.

The provisions that repeal most of the 1998 Act are s.211 (1) (a) and para 44 of Sched. 19 of the Data Protection Act 2018.  S.211 (1) (a) provides:
"In Schedule 19—
(a)  Part 1 contains minor and consequential amendments of primary legislation ..."
Para 44 of Sched. 19 adds:
"The Data Protection Act 1998 is repealed, with the exception of section 62 and paragraphs 13, 15, 16, 18 and 19 of Schedule 15 (which amend other enactments)."
There are of course transitional and saving provisions that I shall address when occasion demands.

Anyone wishing to discuss this article, GDPR or data protection generally may call me on +44 (0)20 7404 5252 during office hours or send me a message through my contact form.