Sunday, 14 October 2018

Privacy Sandbox

Author Hyena
Reproduced with kind permission of the author
Source Wikipedia

Jane Lambert

The Information Commissioner's Office has just carried out a consultation on creating a regulatory sandbox "to develop innovative products and services using personal data in innovative ways" (see ICO call for views on creating a regulatory sandbox on the ICO website).  The idea of a sandbox was pioneered by the Financial Conduct Authority which described it as  "a ‘safe space’ in which businesses can test innovative products, services, business models and delivery mechanisms without immediately incurring all the normal regulatory consequences of engaging in the activity in question" (see FCA Regulatory Sandbox Nov 2015). 

The FCA's idea of a sandbox for new products, services and business models proved not only feasible but popular and has been imitated by other financial services regulators around the world.  The Information Commissioner announced her intention of extending the idea to data protection in her Information Rights Strategic Plan 2017 - 2021:
"Technology goal #8: To engage with organisations in a safe and controlled environment to understand and explore innovative technology. 
  • We will establish a ‘regulatory sandbox’, drawing on the successful sandbox process that the Financial Conduct Authority has developed. The ICO sandbox will enable organisations to develop innovative digital products and services, whilst engaging with the regulator, ensuring that appropriate protections and safeguards are in place. As part of the sandbox process the ICO would provide advice on mitigating risks and data protection by design. 
  • In 2018 we will consult and engage with organisations about implementation of a sandbox."
The consultation closed on Friday but the Call for Evidence makes clear that that was only the first stage of the consultation process. There will be a more detailed proposal for consultation later in the year.

In his blog post Your views will help us build our regulatory sandbox, Chris Taylor, Head of Assurance at the ICO, set out the topics upon which he wants to hear from the public:
  • "what you think the scope of any such sandbox should be - should we focus on particular innovations, sectors or types of organisations?
  • what you think the benefits might be to working in a sandbox, whether that’s our expert input or increased reassurance for your customers or clients.
  • what mechanisms you might find most helpful in a sandbox – from adaptations to our approach, to informal steers or the provision of technical guidance – what are the tools that a sandbox might contain?
  • at what stage in the design and development process a sandbox would be most useful to you?"
Mr Taylor also made a point that applies to innovation generally and not just to data protection law.  It is often said in the USA, and in some quarters in this country, that red tape (a derogatory term for regulation) hobbles innovation and enterprise.  If that were so, the USA would be the most innovative nation on earth, but a glance at the Global Innovation Index shows that it lies behind four European nations including the UK. 

Mr Taylor explains that 
"privacy and innovation go hand in hand. It’s not privacy or innovation, it’s privacy and innovation – because organisations that use our sandbox won’t be exempt from data protection law."
A regulatory sandbox enables regulators to anticipate and make provision for difficulties before they arise, thus sparing new technologies and businesses the legal quagmires that dogged earlier technologies, when the law merely reacted to innovation, often imperfectly. 

The proposed sandbox should mitigate the privacy uncertainties affecting new products, services and business models, but it will not remove them all. Other issues will remain, such as patenting or other IP protection, licensing, competition and so forth.  I am well placed and should be glad to help fintech and other entrepreneurs, or the patent and trade mark attorneys, solicitors, accountants and other professional advisers who may assist them.  Should any of them wish to discuss this article or data protection generally, they are welcome to call me on +44 (0)20 7404 5252 during office hours or send me a message through my contact form.

Thursday, 11 October 2018

Data Protection after Brexit

Author Furfur
Licence Creative Commons Attribution Share Alike 4.0 International

Jane Lambert

Because of the importance of its financial services industry, preserving an uninterrupted flow of personal data across national frontiers is particularly important to the UK.  At present, such flow is guaranteed by art 1 (3) of the General Data Protection Regulation (Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC) ("the GDPR").  When the UK leaves the EU, the GDPR will cease to apply to this country and the UK will become a third country for the purposes of Chapter V of the GDPR.

Our exit from the EU will have no effect on the obligations of data controllers and processors in the UK or the rights of data subjects anywhere with regard to UK controllers and processors because s.3 (1) of the European Union (Withdrawal) Act 2018 will incorporate the GDPR into our law.  The Department for Digital, Culture, Media and Sport has confirmed in its guidance note Data Protection if there's no Brexit deal of 13 Sept 2018 that
"[i]n recognition of the unprecedented degree of alignment between the UK and EU’s data protection regimes, the UK would at the point of exit continue to allow the free flow of personal data from the UK to the EU"
though it adds that  the UK would keep this under review.  On the other hand, art 44 of the GDPR makes clear that data controllers and processors in the states that remain in the EU would be able to transmit personal data to the UK only in accordance with the provisions of Chapter V of the regulation.

Art 45 (1) of the GDPR provides:
"A transfer of personal data to a third country or an international organisation may take place where the Commission has decided that the third country, a territory or one or more specified sectors within that third country, or the international organisation in question ensures an adequate level of protection."
The Department for Digital, Culture, Media and Sport's guidance note records that the "European Commission has stated that if it deems the UK’s level of personal data protection essentially equivalent to that of the EU, it would make an adequacy decision allowing the transfer of personal data to the UK without restrictions." However, it adds that while HM government wants to begin preliminary discussions on an adequacy assessment now, the Commission has stated that a decision on adequacy cannot be taken until the UK is a third country. 

Unless and until the Commission makes an adequacy assessment businesses in the UK must rely on one of the other provisions of Chapter V of the GDPR.  The guidance note suggests:
"For the majority of organisations the most relevant alternative legal basis would be standard contractual clauses. These are model data protection clauses that have been approved by the European Commission and enable the free flow of personal data when embedded in a contract. The clauses contain contractual obligations on you and your EU partner, and rights for the individuals whose personal data is transferred. In certain circumstances, your EU partners may alternatively be able to rely on a derogation to transfer personal data."
It recommends that businesses proactively consider what action they may need to take to ensure the continued free flow of personal data with their EU partners.
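
Purely by way of illustration, the short Python sketch below models that decision flow for an organisation in a remaining member state transferring personal data to the UK: check first for an art 45 adequacy decision, then for appropriate safeguards such as standard contractual clauses under art 46, and finally for an art 49 derogation. The class, function and field names are invented for the example, the conditions are greatly simplified, and the sketch is no substitute for reading Chapter V or for taking advice.

from dataclasses import dataclass

@dataclass
class Transfer:
    adequacy_decision: bool          # has the Commission made an art 45 adequacy decision for the destination?
    standard_clauses_in_place: bool  # are Commission-approved standard contractual clauses embedded in the contract?
    derogation_available: bool       # does one of the art 49 derogations apply in the particular circumstances?

def possible_transfer_basis(t: Transfer) -> str:
    # Work down Chapter V in the order described in the guidance note.
    if t.adequacy_decision:
        return "Art 45: adequacy decision - transfer may proceed without further safeguards"
    if t.standard_clauses_in_place:
        return "Art 46: appropriate safeguards, e.g. standard contractual clauses"
    if t.derogation_available:
        return "Art 49: derogation available only in certain circumstances"
    return "No basis identified - the transfer should not take place"

# Example: no adequacy decision yet, but standard contractual clauses agreed with the UK partner.
print(possible_transfer_basis(Transfer(False, True, False)))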

If the British government and the EU reach a withdrawal agreement in time for ratification before 29 March 2019, there will be an implementation period in which the GDPR will continue to apply to the UK until 31 Dec 2020.  What happens after that will depend on the terms of the agreement on the future relationship between the EU and the UK.  At para 3.2.1 (8) of its white paper The Future Relationship between the United Kingdom and the European Union (Cm 9503) the government says:
"The UK believes that the EU’s adequacy framework provides the right starting point for the arrangements the UK and the EU should agree on data protection but wants to go beyond the framework in two key respects:
a. on stability and transparency, it would benefit the UK and the EU, as well as businesses and individuals, to have a clear, transparent framework to facilitate dialogue, minimise the risk of disruption to data flows and support a stable relationship between the UK and the EU to protect the personal data of UK and EU citizens across Europe; and
b. on regulatory cooperation, it would be in the UK’s and the EU's mutual interest to have close cooperation and joined up enforcement action between the UK's Information Commissioner's Office (ICO) and EU Data Protection Authorities."
It is still not clear whether the EU will agree to the white paper proposal or even whether there will be a withdrawal agreement that will allow a transitional period.

Anyone wishing to discuss this article or data protection generally should call me on 020 7404 5252 during office hours or send me a message through my contact form.

Monday, 25 June 2018

What is meant by the "Applied GDPR"

Jane Lambert

The term the "applied GDPR" is defined by s.3 (11) of the Data Protection Act 2018 as  the GDPR as applied by Chapter 3 of Part 2 of the Act.  According to s.4 (3) Chapter 3 applies to certain types of processing of personal data to which the GDPR does not apply and makes provision for a regime broadly equivalent to the GDPR to apply to such processing.   S.22 (1) of the Act provides that the  GDPR applies to the processing of personal data to which Chapter 3 applies as if its articles were part of an Act of Parliament.

Processing to which Chapter 3 applies
S.21 provides that Chapter 3 applies to:

  • automated or structured processing of personal data in the course of an activity that:
    • falls outside the scope of EU law; or
    • is carried out by a member state in relation to the EU's common foreign and security policy, but does not fall within law enforcement processing (which is covered by Part 3) or processing by the intelligence services (which is covered by Part 4) (s.21 (1)); and
  • manual unstructured processing of personal data held by certain public authorities (s.21 (2)).
S.22 (1) extends the GDPR to the processing of personal data to which Chapter 3 applies as if the GDPR's articles were part of an Act of Parliament for the whole UK.   The explanatory note explains that Chapter 3 applies to manual unstructured processing of personal data held by certain public authorities because such processing was regulated by the Data Protection Act 1998 but not by the GDPR. The public authorities concerned are defined by s.21 (5) as public authorities as defined by the Freedom of Information Act 2000 or Scottish public authorities as defined by the Freedom of Information (Scotland) Act 2002.

Modifications to the GDPR
The provisions of the GDPR that apply to the processing to which Chapter 3 applies are modified by Part 1 of Sched. 6 to the Act. That part consists of 72 paragraphs, most of which modify articles of the GDPR. For instance, art 2 of the GDPR provides:

"Material scope
1. This Regulation applies to the processing of personal data wholly or partly by automated means and to the processing other than by automated means of personal data which form part of a filing system or are intended to form part of a filing system.
2. This Regulation does not apply to the processing of personal data:
(a) in the course of an activity which falls outside the scope of Union law;
(b) by the Member States when carrying out activities which fall within the scope of Chapter 2 of Title V of the TEU;
(c) by a natural person in the course of a purely personal or household activity;
(d) by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, including the safeguarding against and the prevention of threats to public security.
3. For the processing of personal data by the Union institutions, bodies, offices and agencies, Regulation (EC) No 45/2001 applies. Regulation (EC) No 45/2001 and other Union legal acts applicable to such processing of personal data shall be adapted to the principles and rules of this Regulation in accordance with Article 98.
4. This Regulation shall be without prejudice to the application of Directive 2000/31/EC, in particular of the liability rules of intermediary service providers in Articles 12 to 15 of that Directive."
Para 7 substitutes the following provision for art 2 of the GDPR in relation to the processing to which Chapter 3 applies:
“2  This Regulation applies to the processing of personal data to which Chapter 3 of Part 2 of the 2018 Act applies (see section 21 of that Act).”
Supplementary Provisions
As I noted in The Relationship between the Data Protection Act 2018 and the GDPR 20 June 2018, s.4 (2) of the Act provides for Chapter 2 of Part 2 to apply to the types of processing of personal data to which the GDPR applies by virtue of art 2 of the GDPR.  I discussed the provisions of Chapter 2 in that article.  Chapter 2 also applies to the applied GDPR as it applies to the GDPR by virtue of s.22 (2), but Part 2 of Sched. 6 modifies Chapter 2 of Part 2 in respect of the applied GDPR pursuant to s.22 (4) (b).

Interpretation of the Applied GDPR
S.22 (5) of the Act provides:
"A question as to the meaning or effect of a provision of the applied GDPR, or the applied Chapter 2 , is to be determined consistently with the interpretation of the equivalent provision of the GDPR, or Chapter 2 of this Part, as it applies otherwise than by virtue of this Chapter, except so far as Schedule 6 requires a different interpretation."
Rule Making Powers
S.23 (1) enables the Secretary of State to make regulations in relation to the processing of personal data to which Chapter 3 applies.

Manual Unstructured Data
S.24 makes certain modifications to the applied GDPR in relation to unstructured data held by public authorities as defined by the Freedom of Information Act 2000 or Scottish public authorities as defined by the Freedom of Information (Scotland) Act 2002.

Exemptions
Exemptions are made for manual unstructured data used in longstanding historical research by virtue of s.25, and for national security and defence pursuant to s.26, s.27 and s.28.

Further Information
Anyone wishing to discuss this article or data protection generally should call me during office hours on +44 (0)20 7404 5252 or send me a message through my contact form.

Wednesday, 20 June 2018

The Relationship between the Data Protection Act 2018 and the GDPR

Jane Lambert

As I mentioned on the index page for the Data Protection Act 2018, s.1 (1) of the Act states that the Act makes provision about the processing of personal data.  As everyone knows, most processing of personal data is subject to the GDPR but the GDPR makes many references to national law.  Even though the GDPR is directly applicable in the laws of each of the member states by virtue of art 288 of the Treaty on the Functioning of the European Union, the GDPR needs to be supplemented by national legislation to function effectively.  That is why s.1 (3) provides that Part 2 of the Act supplements the GDPR.

The Legislative Scheme
S.1 (1) and (2) are amplified by s.2 (1) which provides:
"The GDPR, the applied GDPR and this Act protect individuals with regard to the processing of personal data, in particular by—
(a) requiring personal data to be processed lawfully and fairly, on the basis of the data subject’s consent or another specified basis,
(b) conferring rights on the data subject to obtain information about the processing of personal data and to require inaccurate personal data to be rectified, and
(c) conferring functions on the Commissioner, giving the holder of that office responsibility for monitoring and enforcing their provisions."
S.4 (2) adds that Chapter 2 of Part 2 applies to the types of processing of personal data to which the GDPR applies by virtue of art 2 and that that Chapter supplements, and must be read with, the GDPR.

Understanding the Scheme
Probably the best way to understand the scheme is to take an example. 

Art 5 of the GDPR  sets out a number of principles for the processing of personal data.  The first of those principles is that personal data shall be processed lawfully, fairly and in a transparent manner in relation to the data subject.  Art 6 (1) stipulates that processing shall be lawful only if and to the extent that one or more specified circumstances apply. One of those circumstances is that processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller (point "e").

What constitutes the public interest and official authority are matters for the legislatures of the member states.   S.8 of the Data Protection Act 2018 provides:
"In Article 6 (1) of the GDPR (lawfulness of processing), the reference in point (e) to processing of personal data that is necessary for the performance of a task carried out in the public interest or in the exercise of the controller’s official authority includes processing of personal data that is necessary for—
(a) the administration of justice,
(b) the exercise of a function of either House of Parliament,
(c) the exercise of a function conferred on a person by an enactment or rule of law,
(d) the exercise of a function of the Crown, a Minister of the Crown or a government department, or
(e) an activity that supports or promotes democratic engagement."
There are similar supplementary provisions on such matters as children's consent, special categories of personal data, powers to make regulations on the fees that can be charged by data controllers in exceptional circumstances, exemptions and transfers abroad.

Further Information
Should anyone wish to discuss this article or data protection generally, he or she should call me on 020 7404 5252 during office hours or send me a message through my contact form.

Monday, 11 June 2018

The Data Protection Act 2018 - repealing the 1998 Act and applying the GDPR

Jane Lambert


As everyone knows, the GDPR (Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation)) repealed and replaced the Data Protection Directive (Directive 95/46/EC) with effect from 25 May 2018.

Even though it repealed the Directive, which had been implemented into English and Welsh, Scottish and Northern Irish law by the Data Protection Act 1998, the GDPR did not automatically repeal the 1998 Act, although the doctrine of the primacy of EU law recognized by the House of Lords in R (Factortame Ltd) v Secretary of State for Transport (No 2) [1990] UKHL 13, [1991] 1 AC 603, [1991] 1 All ER 70, [1990] 3 WLR 818, [1991] 1 Lloyd's Rep 10, (1991) 3 Admin LR 333, [1990] 3 CMLR 375 would have had that practical effect.

For the avoidance of any doubt, Parliament passed the Data Protection Act 2018, which received royal assent on 23 May 2018, two days before the General Data Protection Regulation ("GDPR") was due to take effect.  The introductory text describes the new Act as:
"An Act to make provision for the regulation of the processing of information relating to individuals; to make provision in connection with the Information Commissioner’s functions under certain regulations relating to information; to make provision for a direct marketing code of practice; and for connected purposes."
It consists of 215 sections and 20 schedules.  It is intended to supplement the GDPR and implement the Data Protection Law Enforcement Directive (Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, and repealing Council Framework Decision 2008/977/JHA).

Because the Act received royal assent only days before the GDPR was due to come into effect, the following provisions came into effect immediately:
The very next day, Margot James MP, Minister of State at the Department for Digital, Culture, Media and Sport, signed The Data Protection Act 2018 (Commencement No. 1 and Transitional and Saving Provisions) Regulations 2018 SI 2018 No 625. Reg 2 (1) of those Regulations brought the following provisions of the Data Protection Act 2018 into effect from 25 May 2018:
It will be seen that most of the Act is already in force and the few provisions that are not will come into force on 23 July 2018.

The provisions that repeal most of the 1998 Act are s.211 (1) (a) and para 44 of Sched. 19 of the Data Protection Act 2018.  S.211 (1) (a) provides:
"In Schedule 19—
(a)  Part 1 contains minor and consequential amendments of primary legislation ..."
Para 44 of Sched. 19 adds:
"The Data Protection Act 1998 is repealed, with the exception of section 62 and paragraphs 13, 15, 16, 18 and 19 of Schedule 15 (which amend other enactments)."
There are of course transitional and saving provisions that I shall address when occasion demands.

Anyone wishing to discuss this article, GDPR or data protection generally may call me on +44 (0)20 7404 5252 during office hours or send me a message through my contact form.

Friday, 20 April 2018

Right to be Forgotten - NT1 and another v Google

Royal Courts of Justice
Author Rafa Esteve
Licence Creative Commons Attribution Share Alike 4.0 International
Source Wikipedia

Jane Lambert

Queen's Bench Division (Mr Justice Warby) NT 1 and NT 2 v Google LLC [2018] EWHC 799 (QB) (13 Apr 2018)

These were claims by two businessmen known respectively as NT1 and NT2 against Google LLC.  Both claimants had been convicted of criminal offences but, as their convictions had been spent, they sought orders requiring details of their offending, convictions and sentences to be removed from Google's search results on the grounds that the information was inaccurate, old, out of date, irrelevant, of no public interest, and/or otherwise an illegitimate interference with their human rights. The claimants also sought compensation from Google for continuing to return search results disclosing such details after those complaints had been brought to its notice. In response to NT1's delisting request, Google agreed to block one link on 7 Oct 2014 but declined to block any of the others. NT1 asked Google to reconsider his request, but Google stood by its position. On 26 Jan 2015, NT1's solicitors wrote to Google requiring it to cease processing links to 2 media reports. In April 2015, Google refused. NT1 brought these proceedings on 2 Oct 2015 seeking orders for the blocking and/or erasure of links to the 2 media reports, an injunction to prevent Google from continuing to return such links, and financial compensation. In December 2017, NT1 expanded his claim to cover a third link, relating to a book extract covering the same subject-matter, in similar terms.

NT2's Claim
His lordship summarized NT2's claim at para [7]:
"In the early 21st century, when he was in his forties, NT2 was involved in a controversial business that was the subject of public opposition over its environmental practices. Rather more than ten years ago he pleaded guilty to two counts of conspiracy in connection with that business, and received a short custodial sentence. The conviction and sentence were the subject of reports in the national and local media at the time. NT2 served some six weeks in custody before being released on licence. The sentence came to an end over ten years ago. The conviction became "spent" several years ago. The original reports remained online, and links continued to be returned by Google Search. NT2's conviction and sentence have also been mentioned in some more recent publications about other matters, two of them being reports of interviews given by NT2. In due course, NT2 asked Google to remove such links."
NT2's solicitors submitted a delisting request on 14 Apr 2015. It related to 8 links. Google responded promptly by email on 23 Apr 2015, declining to delist and saying that the links in question "relate to matters of substantial public interest to the public regarding [NT2's] professional life". On 24 June 2015, NT2's solicitors sent a letter of claim. On 2 Oct 2015 they issued proceedings, claiming relief in respect of the same 8 links. In the course of the proceedings, complaints about a further 3 links were added to the claim. The claim advanced by NT2 therefore relates to 11 items.

The Issues
The judge summarized the issues in dispute in each case as follows at para [9]:
"(1) whether the claimant is entitled to have the links in question excluded from Google Search results either
(a) because one or more of them contain personal data relating to him which are inaccurate, or
(b) because for that and/or other reasons the continued listing of those links by Google involves an unjustified interference with the claimant's data protection and/or privacy rights; and 
(2) if so, whether the claimant is also entitled to compensation for continued listing between the time of the delisting request and judgment."
His lordship added:
"Put another way, the first question is whether the record needs correcting; the second question is whether the data protection or privacy rights of these claimants extend to having shameful episodes in their personal history eliminated from Google Search; thirdly, there is the question of whether damages should be paid."
The judge noted at para [10] that these were novel questions that had never been considered by the courts. They arose in a legal environment which was complex and had developed over time.

The Legal Framework
At para [13] of his judgment Mr Justice Warby set out the legal framework:
  1. The European Convention on Human Rights ("the Convention") and in particular art 8 and art 10;
  2. S.3 (1) of the European Communities Act 1972 requiring courts in the UK to make decisions on matters of EU law in accordance with the decisions of the Court of Justice of the European Union ("CJEU");
  3. The Rehabilitation of Offenders Act 1974 which provides for certain convictions to be spent after specified periods of time;
  4. The Data Protection Directive and in particular arts 2, 6, 8, 9, 12, 14, 23 and 29;
  5. The Data Protection Act 1998 and its implementing regulations;
  6. The Human Rights Act 1998 which imported the Convention into English law;
  7. The decisions of the House of Lords in Campbell v MGN Ltd [2004] AC 457, [2004] EMLR 15, [2004] 2 AC 457, [2004] UKHRR 648, [2004] 2 All ER 995, [2004] HRLR 24, [2004] UKHL 22, 16 BHRC 500, [2004] 2 WLR 1232 and Re S (a child) [2004] UKHL 47, [2004] 3 WLR 1129, [2004] 4 All ER 683, [2004] 3 FCR 407, [2005] AC 593, [2005] HRLR 5, 17 BHRC 646, [2005] EMLR 2, [2005] Crim LR 310, [2005] 1 FLR 591, [2005] EMLR 11, [2005] 1 AC 593, [2005] UKHRR 129;
  8. The Charter of Fundamental Rights of the European Union (OJ 18.12.2000 C 364/1);
  9. The decision of the CJEU in C-131/12 Mario Costeja Gonzalez v Google Spain and another EU:C:2014:317, [2014] 3 WLR 659, [2014] EUECJ C-131/12, [2014] All ER (EC) 717, [2014] EMLR 27, [2014] 3 CMLR 50, [2014] ECDR 16, [2014] 2 All ER (Comm) 301, ECLI:EU:C:2014:317, [2014] 1 QB 1022, 36 BHRC 589, [2014] QB 1022; and
  10. The General Data Protection Regulation ("GDPR") and in particular art 17.
NT1's Contentions
NT1 contended that Google was a "data controller" within the meaning of s.1 (1) of the Data Protection Act 1998 and that it owed him a duty under s.4 (4) to process data relating to him in accordance with the "data protection principles" as set out in Sched. 1 to the  Act.  He complained that Google had breached the 1st, 4th, 5th and 6th principles:
"1.  Personal data shall be processed fairly and lawfully and, in particular, shall not be processed unless
(a)   at least one of the conditions in Schedule 2 is met, and
(b)   in the case of sensitive personal data, at least one of the conditions in Schedule 3 is also met.
.......................................
4.   Personal data shall be accurate and, where necessary, kept up to date.
5.   Personal data processed for any purpose or purposes shall not be kept for longer than is necessary for that purpose or those purposes.
6.   Personal data shall be processed in accordance with the rights of data subjects under this Act."
NT1 alleged that Google had breached the 4th data protection principle by linking to 6 articles that contained inaccuracies.  In the alternative he argued that linking to those articles breached one or more of the other principles.

Google's Response to NT1
Google admitted that it was a data controller, that it owed a duty to comply with the data protection principles, that there were inaccuracies in the 6 articles and that it had to balance the interests of the data subject against those of the public in accordance with the CJEU's judgment in the Google Spain case, but offered the following three-pronged defence.  First, it argued that NT1's claim was an abuse of the process of the court on the ground that NT1 was using the Data Protection Act 1998 to obtain relief for damage to reputation that would be unavailable in defamation proceedings by reason of s.8 of the Rehabilitation of Offenders Act 1974.  Secondly, it contended that it had carried out properly the balancing exercise prescribed by the CJEU in Google Spain.  Thirdly, it relied on s.32 of the 1998 Act which provides a limited exemption for journalistic purposes.

The Judgment on NT1's Claim
Though he rejected Google's abuse of process argument and its journalistic purpose defence, Mr Justice Warby found for Google.

As to the abuse of process argument, his lordship agreed that NT1 had brought this action in order to protect his reputation but he was also relying on the CJEU's decision in Google Spain. He accepted NT1's submission that "the Court should not be too liberal in its labelling of prejudice as 'injury to reputation', lest it undermine the Google Spain regime."

He rejected the s.32 defence on the ground that Google's processing of personal data was not only for journalistic, literary or artistic purposes as required by s.32 (1).  Google was a search engine and as such it processed personal data for all sorts of purposes.  Secondly, s.32 (1) (b) and (c) required "reasonable belief" on Google's part that publication would be in the public interest and that complying with the data protection principles would be incompatible with the journalistic, literary or artistic purposes. The judge could find no evidence that Google had given any thought to the public interest.

At paragraph [93] of his judgment, the judge considered whether the 4th data protection principle had been breached.  He found a few inaccuracies in the articles to which NT1 objected but they were not so serious as to give a false impression of the crimes of which he had been convicted.  Some of NT1's complaints had been insufficiently pleaded.  Other complaints were not supported by the evidence.  Even where there were inaccuracies the judge was not persuaded to make any of the orders sought or to award compensation.

As to the other alleged breaches his lordship held that the public interest outweighed the case for delisting. He explained his reasoning at para [170]:
"The information retains sufficient relevance today. He has not accepted his guilt, has misled the public and this Court, and shows no remorse over any of these matters. He remains in business, and the information serves the purpose of minimising the risk that he will continue to mislead, as he has in the past. Delisting would not erase the information from the record altogether, but it would make it much harder to find. The case for delisting is not made out."
For much the same reason he held that the public interest outweighed NT1's reasonable  expectation of privacy under art 8 of the Convention.

The Judgment on NT2's Claim
Although the issues in his case were much the same as in NT1's, NT2 impressed the judge as "an honest and generally reliable witness who listened carefully to the questions put to him, and gave clear and relevant answers." The one article, or item, of which the claimant complained was not a contemporary report of the conviction or sentencing. It had appeared in a national newspaper over 8 years after NT2 had been sentenced.  The judge found at para [190] that the article was inaccurate and gave a misleading impression of the claimant's criminality. Because of the inaccuracy the judge was prepared to make a delisting order.

After performing the Google Spain balancing exercise, the judge concluded at [223]:
"My key conclusions in respect of NT2's delisting claim are that the crime and punishment information has become out of date, irrelevant and of no sufficient legitimate interest to users of Google Search to justify its continued availability, so that an appropriate delisting order should be made. The conviction was always going to become spent, and it did so in March 2014, though it would have done so in July of that year anyway. NT2 has frankly acknowledged his guilt, and expressed genuine remorse. There is no evidence of any risk of repetition. His current business activities are in a field quite different from that in which he was operating at the time. His past offending is of little if any relevance to anybody's assessment of his suitability to engage in relevant business activity now, or in the future. There is no real need for anybody to be warned about that activity."
As to whether NT2 had a reasonable expectation of privacy under art 8 of the Convention, Mr Justice Warby said at [226]:
"The impact on the claimant is such as to engage Article 8. The business prejudice does not suffice for that purpose, but there is just enough in the realm of private and family life to cross the threshold. The existence of a young, second family is a matter of some weight. Even so, the evidence does not, in the end, demonstrate a grave interference. But it is enough to require a justification. Google's case on relevance is very weak. The claimant's evidence suggests that he has acknowledged his past error. The claimant's current and anticipated future business conduct does not make his past conduct relevant to anybody's assessment of him, or not significantly so. Continued accessibility of the information complained of is hard to justify. The factors that go to support that view are weak, by comparison with those that weigh in favour of delisting."
Though the judge decided to make a delisting order he was not persuaded to award compensation as he considered that Google had acted with reasonable care in dealing with NT2's request.

Comment
These are two cases with very similar issues and arguments but significantly different facts.  NT2's wrongdoing was of a lesser order than NT1's. He had expressed contrition. The article of which NT2 complained had been inaccurate and misleading whereas those of which NT1 complained were not.  Unlike NT1, NT2 was trying to rebuild his life in a different business where there was no danger of his repeating his wrongdoing. He was therefore reasonably entitled to privacy.  That is why the balance tipped in NT2's favour but not NT1's.

The judgment is useful in that it lists the authorities to which the court will have regard in future cases and the methodology to be applied in Google Spain cases.  Save that courts will cease to consider the Data Protection Directive and the Data Protection Act 1998 as part of the legal framework after 25 May 2018 the approach to issues of this kind will probably be the same under the GDPR.

Finally, although Mr Justice Warby's decision has been widely reported as a defeat for Google (eg Google loses "right to be forgotten" case 13 Apr 2018), that is not completely true. Google was completely successful in NT1's case and successfully resisted the claim for compensation in NT2's.

Should amplification or clarification be required, call me on 020 7404 5252 during office hours or send me a message through my contact form.

Tuesday, 27 March 2018

Information Commissioner's Charges after GDPR

Bank of England
Author Adrian Pingstone
Licence Copyright waived by owner
Source Wikipedia

Jane Lambert

The General Data Protection Regulation ("GDPR") imposes a number of new obligations on data controllers but, unlike the Data Protection Act 1984 and the Data Protection Act 1998, it does not require them to pay any money.  A small but very welcome concession in exchange for responsibilities that will increase the costs of compliance, one might have thought.

Fat chance! Our own Parliament has passed the Digital Economy Act 2017 section 108 (1) of which enables the Secretary of State to make regulations that "require data controllers to pay charges of an amount specified in the regulations to the Information Commissioner." The government has now published draft regulations under that provision known as The Data Protection (Charges and Information) Regulations 2018 which will come into effect on 25 May 2018. The Explanatory Note  states that they will replace The Data Protection (Notification and Notification Fees) Regulations 2000 SI 2000 No 188.

According to the Information Commissioner's press release, this legislation has been enacted because the government has a statutory duty to ensure that the Information Commissioner's Office is adequately funded (see New model announced for funding the data protection work of the Information Commissioner’s Office 21 Feb 2018 ICO's News and Blogs). They have a point there.  I for one was heartened by photos of ICO investigators doing their job in relation to recent personal data misuse allegations (see Investigators complete seven-hour Cambridge Analytica HQ search 24 March 2018 The Guardian).

The amount of the new charges is set out in reg 3 (1):
"For the purposes of regulation 2 (2), the charge payable by a data controller in—
(a) tier 1 (micro organisations), is £40;
(b) tier 2 (small and medium organisations), is £60
(c) tier 3 (large organisations), is £2,900."
To qualify as a "micro organisation" a business must:
(i)  have a turnover of less than or equal to £632,000 for the data controller’s financial year,
(ii) have no more than 10 members of staff,
(iii) be a charity, or
(iv) be a small occupational pension scheme.
As micro organisations will be offered a £5 discount if they pay by direct debit, the new rules will not increase their payments at all if they take advantage of the concession.  For businesses in tier 3 there will be a massive increase from £500 to £2,900 per year.  The new rates are intended to reflect the relative risk for each category of data controller.
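
Again purely as an illustration, the following Python sketch computes the annual charge from the tier amounts in reg 3 (1) together with the £5 direct debit concession mentioned above. The function name and its inputs are invented for the example, and I have applied the discount only to tier 1 because that is the only tier for which the concession is described here.

CHARGES = {1: 40, 2: 60, 3: 2900}  # reg 3 (1): annual charge in pounds for each tier
DIRECT_DEBIT_DISCOUNT = 5          # concession mentioned above for payment by direct debit

def annual_charge(tier: int, pays_by_direct_debit: bool = False) -> int:
    # Look up the charge for the tier and, for micro organisations (tier 1),
    # deduct the direct debit discount where the concession is taken up.
    charge = CHARGES[tier]
    if tier == 1 and pays_by_direct_debit:
        charge -= DIRECT_DEBIT_DISCOUNT
    return charge

print(annual_charge(1, pays_by_direct_debit=True))  # 35: with the concession a micro organisation pays no more than before
print(annual_charge(3))                             # 2900: the tier 3 charge, up from 500 per year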

Anyone wishing to discuss these rules or data protection in general should call me on 020 7404 5252 during office hours or send me a message through my contact form.