A preview of some of the UK internet legal developments that we can expect in 2018. Any future EU legislation will be subject to Brexit considerations and may or may not apply in the UK.

EU copyright reform In 2016 the European Commission published proposals for

- a Directive on Copyright in the Digital Single Market. As it navigates the EU legislative process the proposal continues to excite controversy, mainly over the proposed publishers’ ancillary right and the clash between Article 13 and the ECommerce Directive’s intermediary liability provisions.

- a Regulation extending the country of origin provisions of the Satellite and Cable Broadcasting Directive to broadcasters’ ancillary online transmissions. Most of the Commission’s proposal was recently rejected by the European Parliament.

- legislation to mandate a degree of online content portability within the EU. The Regulation on cross-border portability of online content services in the internal market was adopted on 14 June 2017 and will apply from 20 March 2018.
EU online business As part of its Digital Single Market proposals the European Commission published a proposal for a Regulation on "Geo-blocking and other forms of discrimination". It aims to prevent online retailers from discriminating, technically or commercially, on the basis of nationality, residence or location of a customer. Political agreement was reached in November 2017. The Regulation would come into force nine months from publication in the EU Official Journal.

Telecoms privacy The proposed EU ePrivacy Regulation continues to make a choppy voyage through the EU legislative process.

Intermediary liability On 28 September 2017 the European Commission published a Communication on Tackling Illegal Content Online.  This is a set of nominally voluntary guidelines under which online platforms would adopt institutionalised notice and takedown/staydown procedures and proactive content filtering processes, based in part on a system of 'trusted flaggers'. The scheme would cover every kind of illegality from terrorist content, through copyright to defamation. The Commission aims to determine by May 2018 whether additional legislative measures are needed.
Politicians have increasingly questioned the continued appropriateness of intermediary liability protections under the Electronic Commerce Directive. The UK Committee on Standards in Public Life has suggested that Brexit presents an opportunity to depart from the Directive. The government has published its Internet Safety Strategy Green Paper. More to come in 2018.

The appeal to the UK Supreme Court in Cartier, on who should bear the cost of complying with site blocking injunctions, should be heard during 2018.
TV-like regulation of the internet The review of the EU Audiovisual Media Services Directive continues. The Commission proposal adopted on 25 May 2016 would further extend the Directive's applicability to on-demand providers and internet platforms.

Pending CJEU copyright cases More copyright references are pending in the EU Court of Justice. Issues under consideration include whether the EU Charter of Fundamental Rights can be relied upon to justify exceptions or limitations beyond those in the Copyright Directive; and whether a link to a PDF amounts to publication for the purposes of the quotation exception (Spiegel Online GmbH v Volker Beck, C-516/17). Another case on the making available right (Renckhoff, C-161/17) is pending. It is also reported that the Dutch Tom Kabinet case on secondhand e-book trading has been referred to the CJEU.
ECommerce Directive Two cases involving Uber are before the CJEU, addressing in different contexts whether Uber’s service is an information society service within the Electronic Commerce Directive. Advocate General Szpunar gave an Opinion in Asociación Profesional Élite Taxi v Uber Systems Spain, C-434/15 on 11 May 2017 and in Uber France SAS, Case C‑320/16 on 4 July 2017.
Online pornography The Digital Economy Act 2017 grants powers to a regulator (recently formally proposed to be the British Board of Film Classification) to determine age control mechanisms for internet sites that make ‘R18’ pornography available; and to direct ISPs to block such sites that either do not comply with age verification or contain material that would not be granted an R18 certificate. The DCMS has published documents including draft guidance to the Age Verification Regulator.

Cross-border liability and jurisdiction Ilsjan (Case C-194/16) is another CJEU reference on the Article 7(2) (ex-Art 5(3)) tort jurisdiction provisions of the EU Jurisdiction Regulation. The case concerns a claim for correction and removal of harmful comments. It asks questions around mere accessibility as a threshold for jurisdiction (as found in Pez Hejduk) and the eDate/Martinez ‘centre of interests’ criterion for recovery in respect of the entire harm suffered throughout the EU. The AG Opinion in Ilsjan was delivered on 13 July 2017.
The French CNIL/Google case on search engine de-indexing has raised significant issues on extraterritoriality, including whether Google can be required to de-index on a global basis. The Conseil d'Etat has referred various questions about this to the CJEU.

Online state surveillance The UK’s Investigatory Powers Act 2016 (IP Act), partially implemented in 2016 and 2017, is expected to come fully into force in 2018. However the government has acknowledged that the mandatory communications data retention provisions of the Act are unlawful in the light of the Watson/Tele2 decision of the CJEU. It has launched a consultation on proposed amendments to the Act, including a new Office for Communications Data Authorisation to approve requests for communications data. Meanwhile a reference to the CJEU from the Investigatory Powers Tribunal questions whether the Watson decision applies to national security, and if so how.
The IP Act (in particular the bulk powers provisions) may also be indirectly affected by cases in the CJEU (challenges to the EU-US Privacy Shield), in the European Court of Human Rights (various NGOs challenging the existing RIPA bulk interception regime) and by a judicial review by Privacy International of an Investigatory Powers Tribunal decision on equipment interference powers. However in that case the Court of Appeal has held that the Tribunal decision is not susceptible of judicial review. One of the CJEU challenges to the EU-US Privacy Shield was held by the General Court on 22 November 2017 to be inadmissible for lack of standing.
Liberty's challenge by way of judicial review to the IP Act bulk powers and data retention powers is pending.
Compliance of the UK’s surveillance laws with EU Charter fundamental rights will be a factor in any data protection adequacy decision that is sought once the UK becomes a non-EU third country post-Brexit.

[Update 18 Dec. Replaced 'EU law' in last para with 'EU Charter fundamental rights'.]
15 questions to illuminate the festive season. Answers in the New Year. (Remember that this is an English law blog). 

Tech teasers 

1. How many data definitions does the Investigatory Powers Act 2016 (IP Act) contain?

2. A technical capability notice (TCN) under the IP Act could prevent a message service from providing end to end encryption to its users. True, False or Maybe?

3. Under the IP Act a TCN requiring installation of a permanent equipment interference capability could be served on a telecommunications operator but not a device manufacturer. True, False or Maybe?

4. Who made a hash of a hashtag?

Brave new world

5. Who marked the new era of post-Snowden transparency by holding a private stakeholder-only consultation on a potentially contentious IP Act draft Statutory Instrument?

6. Who received an early lesson in the independence of the new Investigatory Powers Commissioner?

The penumbra of ECJ jurisdiction
7. The EU Court of Justice (CJEU) judgment in Watson/Tele2 was issued 22 days after the IP Act received Royal Assent. How long elapsed before the Home Office published proposals to amend the Act to take account of the decision?

8. The Investigatory Powers Tribunal has recently made a referral to the CJEU. What is the main question that the CJEU will have to answer about the scope of its Watson decision?  
9. What change was made in the IP Act’s bulk powers, compared with S.8(4) RIPA, that would render the CJEU’s Q.8 answer especially significant?

10. After Brexit we won't need to worry about CJEU surveillance judgments, even if we exit the EU with no deal. True, False or Maybe? 

Copyright offline and online

11. Tweeting a link to infringing material is itself an infringement of copyright. True, False or Maybe?  
12. Reading an infringing copy of a paper book is not copyright infringement. Viewing an infringing copy online is. True, False or Maybe?
13. Whereas selling a set-top box equipped with PVR facilities is legal, providing a cloud-based remote PVR service infringes copyright. True, False or Maybe?

14. Format-shifting infringes copyright. True, False or Maybe?

15. Illegal downloading is a crime. True, False or Maybe?


Over the last four months the Law Commission of England and Wales has been consulting on the topic of Making a Will, focusing on testamentary capacity and formalities.  Chapter 6 of the Consultation is about Electronic Wills. This is my submission on that topic, from the perspective of a tech lawyer who knows little of the law of wills but has grappled many times with the interaction of electronic transactions and formalities requirements.

Introductory Remarks

1. The question at the core of Chapter 6 of the Consultation is how to give effect to testamentary intentions in an increasingly electronic environment. This has at least five aspects, which inevitably conflict with each other to an extent:
  • Providing a reasonable degree of certainty that the testator intended the document in question to have the significant legal effects of a will. This is achieved by requiring a degree of formality and solemnity.
  • Ensuring that formalities do not act as a deterrent to putative testators whether through complexity, cost, consumption of time or uncertainty as to how to achieve compliance.
  • Minimising the risk of a testator’s intentions being defeated by an actual failure to comply with formalities or an inability to demonstrate that formalities were in fact complied with.
  • Providing protection against fraud, tampering and forgery, either of the body of the document or of the signature(s) appended to it.
  • Providing for all the above over the potentially long period of time between execution of the will and its being admitted to probate.
2. The tensions between these requirements mean that a balance must be struck which will not perfectly satisfy any of them, as is the case with the current regime designed for an offline environment.

Signatures versus other formalities

3. Although the focus of electronic transactions regimes tends to be on signatures, signatures should not be addressed in isolation from other relevant formalities[1]. As the Consultation Paper recognises, there is interaction and dependency between signature, form, medium and process. Although the Consultation Paper does not categorise them as such, for wills formalities of all four kinds exist:
  • Signature: the need for signatures and the (possible) requirement that the signature be handwritten (Consultation Paper 6.20 to 6.30)
  • Form: the caselaw requirement for an attestation clause if a strong presumption of due execution is to arise (Consultation Paper 5.11 to 5.12; confusion around the witness attestation requirement is addressed elsewhere in the Consultation paper.)
  • Medium: the requirement that the will be in writing (Consultation Paper 6.15 to 6.19)
  • Process: the presence and simultaneity requirements for witnessing (Consultation Paper 6.32); and the practical filing requirements for admission to probate (6.97).
4. However the Consultation Paper is not always convincing about the relative importance of these formalities. Thus in bringing home to the testator the seriousness of the transaction, the ceremony of gathering two witnesses in the same room simultaneously to witness the testator’s signature would seem likely to be more significant than whether or not the signature is handwritten (cf Consultation Paper 6.48, 6.64). If it had to be done in the presence of two witnesses, appending a signature to an electronic document using (for instance) a tablet would surely be no less a matter of consequence than applying a handwritten signature to a paper document.

5. The overall purpose of giving effect to the testator’s intention where electronic methods are involved may be achievable by an appropriate combination of all four kinds of formality. Not all (or even most of) the heavy lifting has necessarily to be done by the signature itself, any more than with a traditional paper will.

The function of a signature

6. The function of a signature is generally threefold: (1) to indicate assent to or an intention to be bound by the contents of the document, (2) to identify the person so doing and (3) to authenticate the document. (There are variations on these functions. For instance the signature of a witness does not indicate assent or an intention to be bound, but instead is intended to verify the signature of the party to the document.)

7. The difference between the identification and authentication functions can be seen if we consider the different kinds of repudiation that may occur. Identification protects against the claim: ‘That is not my (or X’s) signature’.  Authentication protects against the claim: ‘That is my (or X’s) signature, but that is not the document that I (or X) signed’.

Strengths and weaknesses of electronic signatures

8. As the Consultation Paper notes, ordinary electronic signatures (typed names, copied scans) are poor identifiers and authenticators. Nevertheless English law, in keeping with its historically liberal attitude to formalities requirements generally, rightly regards such signatures as adequate in most cases in which a signature is required by statute. Manuscript signatures are better, but not perfect, identifiers and authenticators. A properly formed manuscript signature is better than a mark, but both are valid.

9. At the other end of the scale of sophistication, certificate-based digital signatures are very good (far better than manuscript signatures) at authenticating the signed document. However they remain relatively poor at assuring the identity of the person who applied the digital signature. This is because however sophisticated the signature technology may be, access to the signature creation device will (in the absence of a biometric link) be secured by a password, a PIN, or something similar. As the Consultation Paper rightly points out these are weak forms of assurance (Consultation Paper 6.60 to 6.68). This aspect can be improved by adopting methods such as two-factor authentication of the user. It may or may not be apparent after the event whether such a technique was used.
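The asymmetry described above, strong authentication of the document but weaker assurance of identity, can be illustrated with a minimal sketch. For simplicity this uses a keyed hash (HMAC) rather than a full certificate-based signature, and the key and document text are invented for illustration; the same point holds for real digital signatures. The signature value is bound to every byte of the document, so tampering is detectable, but it says nothing about which human being actually held the key:

```python
import hashlib
import hmac

def sign(document: bytes, key: bytes) -> str:
    # The signature is derived from every byte of the document, so any
    # alteration to the text invalidates it (the authentication function).
    return hmac.new(key, document, hashlib.sha256).hexdigest()

def verify(document: bytes, signature: str, key: bytes) -> bool:
    return hmac.compare_digest(sign(document, key), signature)

key = b"key-protected-only-by-a-password"  # the weak link: access control
will = b"I leave my estate to Alice."
sig = sign(will, key)

assert verify(will, sig, key)                               # genuine document verifies
assert not verify(b"I leave my estate to Bob.", sig, key)   # tampering is detected

# But anyone who obtains the key produces an identical signature: the
# scheme authenticates the document, not the person (the identification
# function depends entirely on who controlled the key).
assert sign(will, key) == sig
```

The point of the sketch is that the cryptography answers ‘is this the document that was signed?’ with near certainty, while ‘who signed it?’ reduces to the question of who had access to the key, which is exactly the password/PIN weakness the Consultation Paper identifies.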

Common traps in legislating for electronic transactions

Over-engineering and over-estimating the reliability of non-electronic systems

10. The Consultation Paper refers to the apparently stillborn attempt to legislate for electronic wills in Nevada. I am not familiar with the particular legislation in question, but will offer some general comments about the temptation for legislation to impose over-engineered technical solutions.

11. Over-engineering is a natural consequence of over-estimating the reliability of non-electronic systems and thus, in the name of equivalence, attempting to design in a level of assurance for the electronic system that does not exist in the non-electronic sphere.  As the Australian Electronic Commerce Expert Group stated in its 1998 Report to the Attorney-General[2]:
“There is always the temptation, in dealing with the law as it relates to unfamiliar and new technologies to set the standards required of a new technology higher than those which currently apply to paper and to overlook the weaknesses that we know to inhere in the familiar.”
12. Over-engineering occurred in the early days of digital signatures, when complex statutes were passed in some jurisdictions (the Utah Digital Signatures Act being the earliest and best known example) in effect prescribing the use of PKI digital signatures in an attempt to achieve a guarantee of non-repudiation far beyond that provided by manuscript signatures. These kinds of rules were found to be unnecessary for everyday purposes and have tended to be superseded by facilitative legislation such as the US ESign Act.

Over-technical formalities requirements

13. Over-technical formalities requirements are a potential source of concern. This is for two reasons. 

14. First, they increase the chance that a putative testator or a witness will make an error in trying to comply with them. As the Sixth Interim Report of the Law Revision Committee said in 1937 in relation to the Statute of Frauds:
" 'The Act', in the words of Lord Campbell . . . 'promotes more frauds than it prevents'. True it shuts out perjury; but it also and more frequently shuts out the truth. It strikes impartially at the perjurer and at the honest man who has omitted a precaution, sealing the lips of both. Mr Justice FitzJames Stephen ... went so far as to assert that 'in the vast majority of cases its operation is simply to enable a man to break a promise with impunity, because he did not write it down with sufficient formality.’ " 
15. Second, a person attempting to satisfy the formalities requirements must be able to understand how to comply with them without resort to expert technical assistance, and to be confident that they have in fact complied. A formalities specification that requires the assistance of an IT expert to understand it will deter people from using the procedure and increase the incidence of disputes for those who do so. Injustice will be caused if the courts are filled with disputes about whether the right kind of electronic signature has been used and where there is no real doubt about the identity of the testator and the authenticity of the will.


Technology-neutral versus technology-specific legislation

16. As a general rule technology-neutral legislation is preferable to technology-specific legislation.

17. This is for two reasons. First, technology-specific legislation can be overtaken by technological developments, with the result either that it is uncertain whether a new technology complies with the requirements, or that the legislation may clearly exclude the new technology even though functionally it performs as well or better than the old technology. Second, technology-specific legislation tends to lock in particular technology vendors rather than opening the market to all whose offerings are able to provide the required functionality (cf Consultation paper 6.36 and 6.37).

18. Against that, however, is the concern that if legislation is drafted at a very high level of abstraction in order to accommodate possible future technologies, it carries the price of uncertainty as to whether any given technology does or does not comply with the formalities requirements. That is most undesirable, for the reasons set out above.

19. Reconciling these opposing considerations is no easy task. Indeed it may be impossible to achieve a wholly satisfactory resolution. Nevertheless the competing considerations should be recognised and addressed.

Validity versus evidence

20. Validity and evidence have to be considered separately. Validity is not a matter of evidential value. Whilst the overall purpose of a formality requirement may be to maximise evidential value and to deter fraud (cf Lim v Thompson), the formality requirement itself stands separate as a rule of validity. 

Commentary on Chapter 6 of Consultation Paper

21. In the light of the introductory discussion above I offer the following comments on some aspects of Chapter 6. I will start with Enabling Electronic Wills (6.33 to 6.43), since that contains some of the most fundamental discussion.

Enabling Electronic Wills (6.33 to 6.43)

6.34 ‘It is highly likely that their use will become commonplace in the future’.

22. Since ‘the future’ is an indeterminate period this is probably a reasonably safe prediction. However, with apologies to Victor Hugo, there is nothing as feeble as an idea whose time has yet to come.

23. Science fiction films from the 1950s and 1960s routinely showed video communicators – an idea that stubbornly refused to take off for another 50 years. Even now video tends to be used for the occasions when seeing the other person is an actual benefit rather than a hindrance – special family occasions, business conferencing, intimate private exchanges for example.

24. Electronic wills have something of that flavour: possible in principle, but why do it when paper has so many advantages: 
  • (Reasonably) Permanent
  • Cheap
  • (Reasonably) secure
  • (Reasonably) private
  • Serious (ceremonial)
  • (Relatively) simple to comply with
25. By contrast electronic wills, as technology currently stands, would be inherently:
  • Impermanent
  • Costly
  • Insecure
  • Less private
  • Casual
  • Complicated to comply with
26. We cannot exclude the possibility that the effort and expense required to overcome, or at least mitigate, these disadvantages may at the present time be out of proportion to the likely benefit. It is perhaps no surprise that stakeholders report little appetite for electronic wills. We should beware the temptation to force the premature take-up of electronic wills simply because of a perception that everything should be capable of being done electronically.    

27. Whilst predictions in this field are foolish, one way in which technology might enable electronic wills in the future is the development (perhaps from existing nascent e-paper technologies) of cheap durable single-use tablets on which an electronic document and accompanying testator and witness signature details could be permanently inscribed and viewed electronically.

28. This is not to say that legislation should not be re-framed now to facilitate the development of appropriate forms of electronic will. Ideally such legislation should capture the essential characteristics of the desired will-making formalities in a technology-neutral but understandable way, rather than prescribe or enable the prescription of detailed systems. In theory it would not even matter if currently there is no technology that can comply with those characteristics electronically.  Such legislation would allow for the possible future development of as yet unknown compliant technologies.

29. However as already discussed, achieving that aim while at the same time leaving a putative testator with no room for doubt about whether a particular technology does or does not satisfy the requirements of the law is not an easy task. It is also pertinent to consider how the presumption of due execution might apply in an electronic context. With paper the presumption arises from matters apparent on the face of the will (Consultation Paper, 5.11). The more technical and complex the formalities requirements for an electronic will, the less will it be possible for compliance with those formalities to be apparent on the face of the document.

6.34 ‘We have focused on electronic signatures’

30. As already indicated, to focus on electronic signatures to the exclusion of the other relevant formalities is, I would suggest, an invitation to error. In reality the Consultation Paper does, of necessity, refer to the other formalities. However it would be preferable explicitly to recognise the interdependence of the four categories of formality and to consider them as a coherent whole.

6.35 ‘First and most importantly, electronic signatures must be secure’

31. This, it seems to me, risks falling into the related traps of over-engineering and of over-estimating the reliability of non-electronic systems (see [10] above).

32. Nor am I sure that the paragraph adequately separates the three functions of a signature discussed above: assent to terms/intention to be bound, identification and authentication.

33. The statement that an electronic signature must provide “strong evidence that a testator meant formally to endorse the relevant document” elides all three functions. The next sentence “electronic signatures must reliably link a signed will to the person who is purported to have signed it” elides the second and third functions. We then have the statement “Handwritten signatures perform this function well”. It is unclear which function or functions are being referred to. Handwritten signatures do not perform each function equally well.

34. It is true that a (genuine) handwritten signature, buttressed by the surrounding formality of double witnessing, is strong evidence of intention to be bound.

35. A well-formed handwritten signature (a ‘distinctive mark’, in the words of the Consultation Paper) provides reasonably strong evidence of identity, assuming that comparison handwriting can be found (something not required by the Wills Act and so more in the nature of a working assumption - cf para 6.53 of the Consultation paper). A mark (which is permissible under the Wills Act) does not do so. The witnesses (if available) are also relevant to proof of identity.

36. Parenthetically, one wonders whether the evidential weight assumed to be provided by signatures may have changed over the period since the enactment of the Wills Act 1837. The use of marks may have been more widespread than today and forensic techniques must have been less advanced. Do we now attribute greater reassurance to the use of a handwritten signature than was originally the case? In any event, given the wide degree of latitude allowed to the form of a handwritten signature, the degree of assurance cannot be regarded as uniform across all handwritten signatures.

37. A handwritten signature is weak evidence of linkage to the document. The signature is present only on the page on which it appears. Proof of the integrity of the whole document (if required) would depend on factors that have nothing to do with the signature (e.g. analysis of the paper and typescript ink).

38. Manuscript signatures provide a degree of evidential value for some relevant facts, but they are by no means perfect. It is of course true that a typed signature is of less evidential value than most manuscript signatures. Conversely, as discussed above ([9]) even the most sophisticated electronic signature is only as secure as its weakest link: the password or PIN (or combinations of such), or other mechanisms, that the testator has used to protect the signature key.

39. Notwithstanding its common usage I would tend to avoid the use of the word ‘secure’ in relation to electronic signatures without making clear which function or functions of a signature are being referred to and what precisely is meant, in that context, by ‘secure’.

40. Eliding the related roles of signatures and other formalities is apt to cause unnecessary confusion and, I would suggest, risks unintentionally placing too much of the formalities burden on the electronic signature.

6.35 ‘We have worked on the basis that electronic signatures should be no less secure than handwritten signatures’

41. On the face of it this is unexceptional. However, on closer inspection it suffers in two respects.

42. The first, already mentioned, comes from considering the signature in isolation from other formalities. In principle an electronic signature could permissibly be less secure than a manuscript signature if other formalities were sufficiently strong to compensate. For instance (without necessarily recommending this) the view could be taken that a notarised typewritten electronic signature would be acceptable (if a satisfactory way of notarising electronic documents had been found). The electronic signature itself would be less secure than the manuscript signature, but the combination of formalities could be adequate. Use of a notary instead of witnesses would avoid the authorisation problem identified at Consultation Paper 6.84.

43. The second is that when we break down the functions of the signature, as above [6], then factor in the variations in ‘security’ provided by the range of permissible handwritten signatures, it is quite unclear what is meant by the level of ‘security’ of a handwritten signature.  The temptation (see [11] above) is to over-estimate the security of a handwritten signature when making a comparison of this kind.

6.35 ‘It is essential that a legal mechanism exists for determining which electronic signatures are sufficiently secure, and which are not.’

44. Security (whatever may be meant by that in context) is one aspect of an electronic signature. Given what I have said above about the respective merits of technology-neutral and technology-specific legislation, it is probably inevitable that if the electronic signature itself is to bear any of the formalities burden, there will have to be some definition of which kinds of signature qualify and which do not. This is, however, a potential minefield. It is almost impossible to define different kinds of signatures at any level of generality in a way that enables a lay person to understand, or that enables an IT expert to say with certainty, what qualifies and what doesn’t. One only has to look at eIDAS and the Electronic Signatures Directive before it to appreciate that. The ability to be certain that one has complied with the necessary formalities of making a will is surely a sine qua non.

45. At risk of over-repetition it is the whole bundle of formalities, not just the signature, that requires a clear set of legal rules for the electronic environment.

6.36 'There is a risk that narrowly specifying types of valid electronic will could be counterproductive.'

46. Agreed. However see comments above ([16] to [19]) regarding the difficulties of drawing a viable balance between technology-specific and technology-neutral. Also, it is possible (although I have not investigated the matter) that the problem with the existing attempts mentioned in the Consultation might have been over-engineering rather than technology-specificity. Although the two often go hand in hand and over-engineering is always technology-specific, the converse is not necessarily true. A requirement of paper is technology-specific, but not over-engineered.


47. If the principles of clear and understandable requirements for all relevant formalities are adhered to, it ought to follow that any technical method that complies with those formalities is permissible. If all that is being said here is that the requirements must not be so abstract as to create uncertainty as to what does and does not comply, that must be correct (see above [18]).

48. If perhaps this paragraph is recognising that formalities other than the signature itself are relevant, then I would endorse that (see above [3]). Even so this paragraph appears to treat the other formalities as something of an afterthought. This is in my view not a good approach. The better approach is to treat all the formalities as a coherent, interdependent whole.

49. If the last sentence is saying that the law should set out a clear set of formalities for electronic wills, that is one thing. If it is suggesting the establishment of some kind of regulatory body to oversee will-making, that is another matter. Similarly it is unclear what is intended by the reference in 6.39 to ‘regulating’ electronic wills.

6.39 and 6.40

50. See comments on 6.45 below.


51. Witnessing requirements are one of the related formalities discussed above ([3]). Again, however, I believe it is an error to view witnessing requirements as a secondary issue, to be considered consequentially upon the introduction of..

The European Commission recently published a Communication on Tackling Illegal Content Online.  Its subtitle, Towards an enhanced responsibility of online platforms, summarises the theme: persuading online intermediaries, specifically social media platforms, to take on the job of policing illegal content posted by their users.

The Commission wants the platforms to perform eight main functions (my selection and emphasis):

  1. Online platforms should be able to take swift decisions on action against illegal content without a court order or administrative decision, especially where notified by a law enforcement authority. (Communication, para 3.1)
  2. Platforms should prioritise removal in response to notices received from law enforcement bodies and other public or private sector 'trusted flaggers'. (Communication, para 4.1)
  3. Fully automated removal should be applied where the circumstances leave little doubt about the illegality of the material (such as where the removal is notified by law enforcement authorities). (Communication, para 4.1)
  4. In a limited number of cases platforms may remove content notified by trusted flaggers without verifying legality themselves. (Communication, para 3.2.1)
  5. Platforms should not limit themselves to reacting to notices but adopt effective proactive measures to detect and remove illegal content. (Communication, para 3.3.1)
  6. Platforms should take measures (such as account suspension or termination) which dissuade users from repeatedly uploading illegal content of the same nature. (Communication, para 5.1)
  7. Platforms are strongly encouraged to use fingerprinting tools to filter out content that has already been identified and assessed as illegal. (Communication, para 5.2)
  8. Platforms should report to law enforcement authorities whenever they are made aware of or encounter evidence of criminal or other offences. (Communication, para 4.1)
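
By way of illustration only, the kind of 'fingerprinting tool' contemplated by point 7 can be sketched in a few lines of Python. Everything here is my own assumption: real deployments use shared databases of perceptual hashes robust to re-encoding, whereas this exact-match sketch merely illustrates the 'notice and staydown' mechanic — and its central feature, that the re-upload is blocked with no fresh assessment of legality.

```python
import hashlib

# Hypothetical registry of fingerprints of content previously assessed
# as illegal. In practice this would be a shared perceptual-hash
# database; here it is a plain set of SHA-256 digests.
BLOCKLIST = set()

def fingerprint(content: bytes) -> str:
    """Return an exact-match fingerprint of the uploaded content."""
    return hashlib.sha256(content).hexdigest()

def register_removed(content: bytes) -> None:
    """Record a takedown so identical re-uploads are caught automatically."""
    BLOCKLIST.add(fingerprint(content))

def allow_upload(content: bytes) -> bool:
    """Reject any upload whose fingerprint matches removed content."""
    return fingerprint(content) not in BLOCKLIST

# A first upload passes; once the item is flagged and registered, the
# identical re-upload is filtered out -- no human or court in the loop.
post = b"some previously notified material"
first = allow_upload(post)
register_removed(post)
second = allow_upload(post)
```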

It can be seen that the Communication does not stop with policing content. The Commission wants the platforms to act as detective, informant, arresting officer, prosecution, defence, judge, jury and prison warder: everything from sniffing out content and deciding whether it is illegal to locking the impugned material away from public view and making sure the cell door stays shut. When platforms aren’t doing the detective work themselves they are expected to remove users’ posts in response to a corps of ‘trusted flaggers’, sometimes without reviewing the alleged illegality themselves. None of this with a real judge or jury in sight. 

In May 2014 the EU Council adopted its Human Rights Guidelines on Freedom of Expression Online and Offline.  The Guidelines say something about the role of online intermediaries.  Paragraph 34 states:
“The EU will … c) Raise awareness among judges, law enforcement officials, staff of human rights commissions and policymakers around the world of the need to promote international standards, including standards protecting intermediaries from the obligation of blocking Internet content without prior due process.” (emphasis added)
A leaked earlier draft of the Communication referenced the Guidelines. The reference was removed from the final version.  It would certainly have been embarrassing for the Communication to refer to that document. Far from “protecting intermediaries from the obligation of blocking Internet content without prior due process”, the premise of the Communication is that intermediaries should remove and filter content without prior due process.  The Commission has embraced the theory that platforms ought to act as gatekeepers rather than gateways, filtering the content that their users upload and read.

Article 15, where are you?
Not only is the Communication's approach inconsistent with the EU’s Freedom of Expression Guidelines, it challenges a longstanding and deeply embedded piece of EU law. Article 15 of the ECommerce Directive has been on the statute book for nearly 20 years. It prohibits Member States from imposing general monitoring obligations on online intermediaries. Yet the Communication says that online platforms should “adopt effective proactive measures to detect and remove illegal content online and not only limit themselves to reacting to notices which they receive”.  It “strongly encourages” online platforms to step up cooperation and investment in, and use of, automatic detection technologies.

A similar controversy around conflict with Article 15 has been stirred up by Article 13 of the Commission’s proposed Digital Single Market Copyright Directive, also in the context of filtering.

Although the measures that the Communication urges on platforms are voluntary (thus avoiding a direct clash with Article 15) that is more a matter of form than of substance. The velvet glove openly brandishes a knuckleduster: the explicit threat of legislation if the platforms do not co-operate.
“The Commission will continue exchanges and dialogues with online platforms and other relevant stakeholders. It will monitor progress and assess whether additional measures are needed, in order to ensure the swift and proactive detection and removal of illegal content online, including possible legislative measures to complement the existing regulatory framework. This work will be completed by May 2018.”
The platforms have been set their task and the big stick will be wielded if they don't fall into line. It is reminiscent of the UK government’s version of co-regulation: 
“Government defines the public policy objectives that need to be secured, but tasks industry to design and operate self-regulatory solutions and stands behind industry ready to take statutory action if necessary.” (e-commerce@its.best.uk, Cabinet Office 1999.)
So long as those harnessed to the task of implementing policy don’t kick over the traces, the uncertain business of persuading a democratically elected legislature to enact a law is avoided.

The Communication displays no enthusiasm for Article 15.  It devotes nearly two pages of close legal analysis to explaining why, in its view, adopting proactive measures would not deprive a platform of hosting protection under Article 14 of the Directive.  Article 15, in contrast, is mentioned in passing but not discussed.  

Such tepidity is the more noteworthy when the balance between fundamental rights reflected in Article 15 finds support in, for instance, the recent European Court of Human Rights judgment in Tamiz v Google ([84] and [85]). Article 15 is not something that can or should be lightly ignored or manoeuvred around.

For all its considerable superstructure - trusted flaggers, certificated standards, reversibility safeguards, transparency and the rest - the Communication lacks solid foundations. It has the air of a castle built on a chain of quicksands: presumed illegality, lack of prior due process at source, reversal of the presumption against prior restraint, assumptions that illegality is capable of precise computation, failure to grapple with differences in Member States' laws, and others.

Whatever may be the appropriate response to illegal content on the internet – and no one should pretend that this is an easy issue – it is hard to avoid the conclusion that the Communication is not it.

The Communication has already come in for criticism (here, here, here, here and here). At risk of repeating points already well made, this post will take an issue by issue dive into its foundational weaknesses.

To aid this analysis I have listed 30 identifiable prescriptive elements of the Communication. They are annexed as the Communication's Action Points (my label). Citations in the post are to that list and to paragraph numbers in the Communication.

Index to Issues and Annex

Presumed illegal
Underlying much of the Communication’s approach to tackling illegal content is a presumption that, once accused, content is illegal until proven innocent. That can be found in:
  • its suggestion that content should be removed automatically on the say-so of certain trusted flaggers (Action Points 8 and 15, paras 3.2.1 and 4.1);
  • its emphasis on automated detection technologies (Action Point 14, para 3.3.2);
  • its greater reliance on corrective safeguards after removal than preventative safeguards before (paras 4.1, 4.3);
  • the suggestion that platforms’ performance should be judged by removal rates (the higher the better) (see More is better, faster is best below);
  • the suggestion that in difficult cases the platform should seek third party advice instead of giving the benefit of the doubt to the content (Action Point 17, para 4.1);
  • its encouragement of quasi-official databases of ‘known’ illegal content, but without a legally competent determination of illegality (Action Point 29, para 5.2). 

Taken together these add up to a presumption of illegality, implemented by prior restraint.

In one well known case a tweet provoked a criminal prosecution, resulting in a magistrates’ court conviction. Two years later the author was acquitted on appeal. Under the trusted flagger system the police could notify such a tweet to the platform at the outset with every expectation that it would be removed, perhaps automatically (Action Points 8 and 15, paras 3.2.1, 4.1).  A tweet ultimately found to be legal would most probably have been removed from public view, without any judicial order. 

Multiply that thousands of times, factor in that speedy removal will often be the end of the matter if no prosecution takes place, and we have prior permanent restraint on a grand scale.

Against that criticism it could be said that the proposed counter notice arrangements provide the opportunity to reverse errors and doubtful decisions.  However by that time the damage is done.  The default has shifted to presumed illegality, inertia takes over and many authors will simply shrug their shoulders and move on for the sake of a quiet life.  

If the author does object, the material would not automatically be reinstated. The Communication suggests that reinstatement should take place if the counter-notice provides reasonable grounds to consider that removed content is not illegal. The burden has shifted to the author to establish innocence.

If an author takes an autonomous decision to remove something that they have previously posted, that is their prerogative. No question of interference with freedom of speech arises.  But if the suppression results from a state-fostered system that institutionalises removal by default and requires the author to justify its reinstatement, there is interference with freedom of speech of both the author and those who would otherwise have been able to read the post. It is a classic chilling effect.

Presumed illegality does not feature in any set of freedom of speech principles, offline or online.  Quite the opposite. The traditional presumption against prior restraint, forged in the offline world, embodies the principle that accused speech should have the benefit of the doubt. It should be allowed to stay up until proved illegal. Even then, the appropriate remedy may only be damages or criminal sanction, not necessarily removal of the material from public view.  Only exceptionally should speech be withheld from public access pending an independent, fully considered, determination of legality with all due process.

Even in these days of interim privacy injunctions routinely outweighing freedom of expression, presumption against prior restraint remains the underlying principle. The European Court of Human Rights observed in Spycatcher that “the dangers inherent in prior restraints are such that they call for the most careful scrutiny on the part of the Court”.

In Mosley v UK the ECtHR added a gloss that prior restraints may be "more readily justified in cases which demonstrate no pressing need for immediate publication and in which there is no obvious contribution to a debate of general public interest". Nevertheless the starting point remains that the prior restraint requires case by case justification. All the more so for automated prior restraint on an industrial scale with no independent consideration of the merits.

Due process at source
A keystone of the Communication is the proposed system of 'trusted flaggers' who offer particular expertise in notifying the presence of potentially illegal content: “specialised entities with specific expertise in identifying illegal content, and dedicated structures for detecting and identifying such content online” (para 3.2.1)

Trusted flaggers “can be expected to bring their expertise and work with high quality standards, which should result in higher quality notices and faster take-downs” (para 3.2.1).  Platforms would be expected to fast-track notices from trusted flaggers. The Commission proposes to explore with industry the potential of standardised notification procedures.
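
Purely by way of illustration, a 'standardised notice' might carry fields along the following lines. The field names are my own assumptions, not anything proposed in the Communication; the point of the sketch is that a platform could mechanically fast-track notices from registered flaggers while rejecting those lacking a reasoned explanation.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# A hypothetical standardised takedown notice. None of these field
# names comes from the Communication; they are assumptions about the
# minimum a platform would need to fast-track and audit a notice.
@dataclass
class Notice:
    flagger_id: str          # registered identity of the trusted flagger
    content_url: str         # location of the allegedly illegal item
    alleged_illegality: str  # e.g. "terrorist content", "copyright"
    explanation: str         # why the flagger considers it illegal
    submitted_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

    def is_complete(self) -> bool:
        """A platform might refuse notices lacking a reasoned explanation."""
        return bool(self.flagger_id and self.content_url
                    and self.alleged_illegality and self.explanation)
```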

Trusted flaggers would range from law enforcement bodies to copyright owners. The Communication names the Europol Internet Referral Unit for terrorist content and the INHOPE network of reporting hotlines for child sexual abuse material as trusted flaggers. It also suggests civil society organisations or semi-public bodies specialised in reporting illegal online racist and xenophobic material.

The emphasis is notably on practical expertise rather than on legal competence to determine illegality. This is especially significant for the proposed role of law enforcement authorities as trusted flaggers. The police detect crime, apply to court for arrest or search warrants, execute them, mostly hand over cases to prosecutors and give evidence in court. They may have practical competence in combating illegal activities, but they do not have legal competence to rule on legality or illegality (see below Legal competence v practical competence).

The system under which the Europol IRU sends takedown notices to platforms is illustrative. Thousands - in the case of the UK's similar Counter Terrorism Internet Referral Unit (CTIRU) hundreds of thousands - of items of content are taken down on the say-so of the police, with safeguards against overreaching dependent on the willingness and resources of the platforms to push back. 

It is impossible to know whether such systems are ‘working’ or not, since there is (and is meant to be) no public visibility and evaluation of what has been removed. 

As at July 2017 the CTIRU had removed some 270,000 items since 2010.  A recent freedom of information request by the Open Rights Group for a list of the kind of “statistical records, impact assessments and evaluations created and kept by the Counter Terrorism Internet Referrals Unit in relation to their operations” was rejected on grounds that it would compromise law enforcement by undermining the operational effectiveness of the CTIRU and have a negative effect on national security.

In the UK PIPCU (the Police Intellectual Property Crime Unit) is the specialist intellectual property enforcement unit of the City of London Police. One of PIPCU’s activities is to write letters to domain registrars asking them to suspend domains used for infringing activities. Its registrar campaign led to this reminder from a US arbitrator of the distinction between the police and the courts:
“To permit a registrar of record to withhold the transfer of a domain based on the suspicion of a law enforcement agency, without the intervention of a judicial body, opens the possibility for abuse by agencies far less reputable than the City of London Police. Presumably, the provision in the Transfer Policy requiring a court order is based on the reasonable assumption that the intervention of a court and judicial decree ensures that the restriction on the transfer of a domain name has some basis of “due process” associated with it.”
A law enforcement body not subject to independent due process (such as applying for a court order) is at risk of overreach, whether through over-enthusiasm for the cause of crime prevention, succumbing to groupthink or some other reason. Due process at source is designed to prevent that. Safeguards at the receiving end do not perform the same role of keeping official agencies in check.

The Communication suggests (Action Point 8, 3.2.1) that in ‘a limited number of cases’ platforms could remove content notified by certain trusted flaggers without verifying legality themselves. 

What might these 'limited cases' be?  Could it apply to both state and private trusted flaggers? Would it apply to any kind of content and any kind of illegality, or only to some? Would it apply only where automated systems are in place? Would it apply only where a court has authoritatively determined that the content is illegal? Would it apply only to repeat violations? The Communication does not tell us. Where it would apply, absence of due process at source takes on greater significance.

Would it perhaps cover the same ground as Action Point 15, under which fully automated deletion should be applied where "circumstances leave little doubt about the illegality of the material", for example (according to the Communication) when notified by law enforcement authorities?

When we join the dots of the various parts of the Communication the impression is of significant expectation that instead of considering the illegality of content notified by law enforcement, platforms may assume that it is illegal and automatically remove it.

The Communication contains little to ensure that trusted flaggers make good decisions. Most of the safeguards are post-notice and post-removal and consist of procedures to be implemented by the platforms. As to specific due process obligations on the notice giver, the Communication is silent.

The contrast between this and the Freedom of Expression Guidelines noted earlier is evident. The Guidelines emphasise prior due process. The Communication emphasises ex post facto remedial safeguards to be put in place by the platforms. Those are expected to compensate for absence of due process on the part of the authority giving notice in the first place.

Legal competence v practical competence
The Communication opens its section on notices from state authorities by referring to courts and competent authorities able to issue binding orders or administrative decisions requiring online platforms to remove or block illegal content. Such bodies, it may reasonably be assumed, would incorporate some element of due process in their decision-making prior to the issue of a legally binding order.

However we have seen that the Communication abandons that limitation, referring to ‘law enforcement and other competent authorities’. A ‘competent authority’ is evidently not limited to bodies embodying due process and legally competent to determine illegality.

It includes bodies such as the police, who are taken to have practical competence through familiarity with the subject matter. Thus in the Europol EU Internet Referral Unit “security experts assess and refer terrorist content to online platforms”.

It is notable that this section in the earlier leaked draft did not survive the final edit:
“In the EU, courts and national competent authorities, including law enforcement authorities, have the competence to establish the illegality of a given activity or information online.” (emphasis added)
Courts are legally competent to establish legality or illegality, but law enforcement bodies are not. 

In the final version the Commission retreats from the overt assertion that law enforcement authorities are..
A preview [updated with developments as at 7 October 2017] of some of the UK internet legal developments that we can expect in 2017. Any proposed EU legislation will be subject to Brexit considerations and so may never happen in the UK.

EU copyright reform In 2016 the European Commission published proposals for a Directive on Copyright in the Digital Single Market (widely viewed as being in the main internet-unfriendly), for a Regulation extending the country of origin provisions of the Satellite and Cable Broadcasting Directive to broadcasters' ancillary online transmissions and for a proposal to mandate a degree of online content portability within the EU. The legislative processes will continue through 2017. [EU Regulation on cross-border portability of online content services in the internal market adopted 14 June 2017. Applies from 20 March 2018.] 

EU online business As part of its Digital Single Market proposals the European Commission has published a proposal for a Regulation on "Geo-blocking and other forms of discrimination". It aims to prevent online retailers from discriminating, technically or commercially, on the basis of nationality, residence or location of a customer. 

[Intermediary liability The European Commission has published a Communication on Tackling Illegal Content Online.  This is a set of (in name) voluntary guidelines under which online platforms would adopt institutionalised notice and takedown/staydown procedures and proactive content filtering processes, based in part on a system of 'trusted flaggers'. The system would cover every kind of illegality from terrorist content, through copyright, to defamation. The Commission aims to determine by May 2018 whether additional legislative measures are needed.]
UK criminal copyright infringement The Digital Economy Bill is about to start its Lords Committee stage. Among other things the Bill implements the government’s decision to seek an increase in the maximum sentence for criminal copyright infringement by communication to the public from two years to ten years. The Bill also redefines the offence in a way that, although intended to exclude minor infringements, has raised concerns that it in fact expands the scope of the offence. [The Digital Economy Act 2017 received Royal Assent on 27 April 2017. The amendments to the criminal copyright offence come into force on 1 October 2017.]

Pending CJEU copyright cases Several copyright references are pending in the EU Court of Justice. Issues under consideration include communication to the public and magnet links (BREIN/Pirate Bay C-610/15 [CJEU judgment delivered 14 June 2017]), links to infringing movies in an add-on media player (BREIN/Filmspeler C-527/15 [CJEU judgment delivered 26 April 2017]), site blocking injunctions (BREIN/Pirate Bay), applicability of the temporary copies exception to viewing infringing movies (BREIN/Filmspeler) and cloud-based remote PVR (VCAST C-265/16) [Advocate-General Opinion delivered 7 September 2017].

Online pornography The Digital Economy Bill would grant powers to a regulator (intended to be the British Board of Film Classification) to determine age control mechanisms for internet sites that make ‘R18’ pornography available; and to direct ISPs to block such sites that either do not comply with age verification or contain material that would not be granted an R18 certificate. These aspects of the Bill have been criticised by the UN Special Rapporteur on freedom of expression, by the House of Lords Delegated Powers and Regulatory Reform Committee and by the House of Lords Constitution Committee. [The Digital Economy Act 2017 received Royal Assent on 27 April 2017.  The DCMS has published surrounding documents including draft guidance to the Age Verification Regulator.]

Net neutrality and parental controls The net neutrality provisions of the EU Open Internet Access and Roaming Regulation potentially affect the ability of operators to choose to provide network-based parental control filtering to their customers. A transitional period for existing self-regulatory schemes expired on 31 December 2016. The government has said that although it does not regard the Regulation as outlawing the existing UK voluntary parental controls regime, to put the matter beyond doubt it will introduce an amendment to the Digital Economy Bill to put the parental controls scheme on a statutory basis. [Enacted as S.104 of the Digital Economy Act 2017 'Internet filters', which came into force on 31 July 2017.]

TV-like regulation of the internet The review of the EU Audio Visual Media Services Directive continues. The Commission proposal adopted on 25 May 2016 would further extend the Directive's applicability to on-demand providers and internet platforms.

Cross-border liability and jurisdiction Ilsjan (Case C-194/16) is another CJEU reference on the Article 7(2) (ex-Art 5(3)) tort jurisdiction provisions of the EU Jurisdiction Regulation. The case concerns a claim for correction and removal of harmful comments. It asks questions around mere accessibility as a threshold for jurisdiction (as found in Pez Hejduk) and the eDate/Martinez ‘centre of interests’ criterion for recovery in respect of the entire harm suffered throughout the EU. Meanwhile significant decisions on extraterritoriality are likely to be delivered in the French Conseil d'Etat (CNIL/Google) and Canadian Supreme Court (Equustek/Google). [AG Opinion in Ilsjan delivered 13 July 2017. Canadian Supreme Court judgment in Equustek delivered 28 June 2017.  (For commentary see here.) In the French CNIL/Google case the Conseil d'Etat has referred questions on territoriality and remedies to the CJEU.]

Online state surveillance The UK’s Investigatory Powers Act 2016 is expected to be implemented in stages throughout 2017 [and 2018]. The Watson/Tele2 decision of the CJEU has already cast a shadow over the data retention provisions of the Act, which will almost certainly now have to be amended. The Watson case, which directly concerns the now expired data retention provisions of DRIPA, will shortly return to the Court of Appeal for further consideration in the light of the CJEU judgment. [In September 2017 the Investigatory Powers Tribunal decided to make a reference to the CJEU asking questions about the applicability of Watson to national security.] The IP Act (in particular the bulk powers provisions) may also be indirectly affected by pending cases in the CJEU (challenges to the EU-US Privacy Shield), in the European Court of Human Rights (ten NGOs challenging the existing RIPA bulk interception regime) and by a judicial review by Privacy International of an Investigatory Powers Tribunal decision on equipment interference powers. [Judicial review application dismissed 2 February 2017; under appeal to Court of Appeal.] Finally, Liberty has announced that it is launching a direct challenge in the UK courts against the IP Act bulk powers. [The challenge also relates to the IP Act data retention powers. Permission to proceed granted.] [New revised and simplified mindmap of legal challenges as at 7 October 2017 (previous version here).]

 [Updated 3 March 2017 with various developments and new mindmap. Further updated 9 August 2017 and 7 October 2017.]

The Canadian Supreme Court decision in Equustek and the French Conseil d'Etat decision to make a CJEU reference in Google v CNIL have once again focused attention on the intractable issues around cross-border liability for publication on the internet. 

This is a topic on which I have been writing and speaking off and on since 1996 when I first heard lawyers calling for an international convention to govern cross-border internet liability issues. My view then was that, given the strong inclination of each state to assert the superiority of its own laws over anyone else's, anything that 100-plus governments were able to agree on was unlikely to be good for the internet.  We were better off with continuing chaos. (See further my contribution 'Content on the Internet - Law, Regulation, Convergence and Human Rights' in International Law and The Hague's 750th Anniversary (ed. W.P. Heere, TMC Asser Press 1999).)

Subsequent experience has, if anything, reinforced that view. And what was originally a debate about information published across borders has, not to its benefit, now become tangled up with the separate jurisdictional question of police and intelligence agency powers to obtain private user data from internet companies in other countries (see e.g. Global Commission on Internet Governance Primer on Internet Jurisdiction (PDF)). 

The root of the intractability is simple.  Unlike any previous medium, the internet is cross border by default and is used, not just as reader but as publisher, by individuals in their hundreds of millions. An internet site is, unless its operator takes positive steps to prevent it by geo-blocking, accessible in any country (excepting those that have erected national firewalls at their borders and banned VPNs). This applies not only to commercial websites but also to individual blogs, tweets, Facebook posts and the like. Accessibility also applies to search engines, which since they operate on the internet are themselves by default accessible worldwide.  
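
The 'positive steps' involved in geo-blocking amount, in essence, to no more than this hypothetical sketch (the IP addresses and country mapping are invented; a real service would consult a GeoIP database rather than a hard-coded table):

```python
# Invented stand-in for a real IP-to-country lookup service.
GEO_DB = {
    "203.0.113.7": "FR",
    "198.51.100.4": "CA",
}

# Jurisdictions where, hypothetically, a court has ordered blocking.
BLOCKED_COUNTRIES = {"FR"}

def country_of(ip: str) -> str:
    # Fall back to "??" when the address is not in the database --
    # a real operator must decide whether unknown means blocked or not.
    return GEO_DB.get(ip, "??")

def is_blocked(ip: str) -> bool:
    """Refuse to serve the page to visitors from blocked jurisdictions."""
    return country_of(ip) in BLOCKED_COUNTRIES
```

The simplicity is deceptive: the lookup table is the easy part, while the legal question of which jurisdictions belong in the blocked set is precisely what the cases discussed here contest.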

How do nation states respond?  They may accept the possibility that their citizens (or residents and visitors) can, if they try hard enough, find information on the internet that has been created under other countries' laws and which may not be lawful at home (just as when citizens travel abroad and read books not available at home). 

If states reject that possibility they end up either forcing their laws on the citizens of other countries by insisting on worldwide removal, or compelling the site or search engine to geo-block. That holds out the prospect of less permeable borders in cyberspace than existed in the pre-internet physical world (putting on one side the ugly precedents of the Berlin Wall and jamming foreign broadcasts). (See further my chapter 'Cyberborders and the Right to Travel in Cyberspace' in The Net and the Nation State (ed. Uta Kohl, CUP 2017).)  

In a submission to the Leveson Inquiry (PDF) in 2012 Max Mosley said: 
“Anyone using the internet must therefore obey the laws in their country.  Similarly, they should obey the law in countries where their posts appear. As a practical matter, it is the search engines and service providers which can best prevent breaches of the law outside the country of origin of the original post."
As the Equustek and CNIL cases illustrate, the focus is indeed now on search engines, with plaintiffs seeking to leverage their gatekeeper potential not just domestically but on a worldwide basis. 

But Mr Mosley's beguiling proposition begs the same questions that have been asked since the 1990s: should the mere fact that a post appears in another country be enough to trigger the law of that country when the default setting built into the internet is cross-border accessibility? Does it accord with reasonable expectations to require individual bloggers and social media posters to comply with the laws of all countries? Does such a rule strike the right balance from the point of view of readers worldwide, bearing in mind the resulting incentive to apply the most restrictive common denominator? My answer is no to all three questions. (See further my September 2012 submission (PDF) to the Leveson Inquiry commenting on Max Mosley's proposal.)

In one respect we have made progress since 1996. In an increasing number of subject matter areas a targeting test has been held (at least within the EU) to define the territorial scope of a right. Targeting rules hold out the prospect of something approaching a peaceful co-existence regime. Properly formulated and applied, a targeting test (a) lays down that mere accessibility does not trigger the laws or jurisdiction of another country and (b) requires relevant positive conduct, not mere omission, in order to do so. (See further my articles Directing and Targeting: The Answer to the Internet's Jurisdiction Problems (Computer Law Review International, 2004) and Here, There or Everywhere? Cross-border Liability on the Internet (Computer and Telecommunications Law Review, 2007 C.T.L.R. 41).)

However, the furore that periodically erupts around cross-border internet cases shows that there is still little consensus on these issues. Nuanced approaches may be at greatest risk of being jettisoned when the law in question is said to embody a core value of the state asked to adopt an expansive jurisdictional stance. That is also the time when greatest care should be taken not to let enthusiasm for the perceived merits of domestic law override respect for the different laws of other countries and the principle of peaceful co-existence.

The Supreme Court of Canada has issued its decision in Google Inc v Equustek (28 June 2017). This is the case in which a small Canadian technology company, Equustek, asked the Canadian courts to grant an injunction against the well-known US search engine ordering it to de-index specified websites - not just on its Canadian domain google.ca, but on a worldwide basis. The injunction was an interim order pending trial of Equustek’s action against the operator of the websites. The SCC (by a 7-2 majority) dismissed Google’s appeal and upheld the injunction.

According to taste and point of view the decision is:

(a) a victory for a small Canadian company against a US tech giant

(b) a damaging precedent for future overreaching assertions of extraterritorial jurisdiction by other nation states

(c) a narrowly decided case about interim injunctions with few broad implications

(d) a case that paid insufficient attention to its underlying territorial moorings

(e) a decision that reinforces the role that online intermediaries can and should play in combating unlawful activities

(f) a case heavily influenced by the unattractive behaviour of the operator of the impugned websites, which lays down little in the way of principle to guide future cases

(g) an uncontroversial and well-reasoned illustration of the circumstances in which a court may make orders with extraterritorial effect

(h) another nail in the coffin of worldwide freedom of expression

(i) a pointless exercise in applying national law to the inherently borderless internet.

Commentators critical and supportive (here, here, here, here, here, here and here) have begun to dissect the decision. Some reaction has been less than nuanced. One tweet asked what business a national court had upholding a global injunction, as if no national court had ever issued an injunction with extraterritorial effect before.

Courts have long considered themselves able to grant extraterritorial injunctions. However, out of concern for offending the sensibilities of other territorially sovereign states (comity) they have tended to exercise caution about the circumstances in which they should do so, about the extent of the injunction and to pay close attention to safeguards designed to minimise any possible international conflict.

The difference between Equustek and previous kinds of world-wide injunction (such as asset-freezing orders) is of course the internet. That distinction cuts both ways. In one direction (emphasised by the SCC) it may be said that a worldwide medium requires a worldwide remedy if it is to be effective. In the other direction the internet, as a cross-border vehicle for speech, amplifies and broadens the extraterritorial impact of injunctions aimed at online activity. From that perspective national courts should be more, not less, cautious about extraterritorial effects on the internet.

Comity urges sensitivity to the concerns of other states, including a state's interest in protecting the rights of its own citizens. The British Columbia Court of Appeal judgment in Equustek adopted this description of comity in the Canadian case of Spencer v The Queen:
"'Comity' in the legal sense, is neither a matter of absolute obligation, on the one hand, nor of mere courtesy and good will, upon the other. But it is the recognition which one nation allows within its territory to the legislative, executive or judicial acts of another nation, having due regard both to international duty and convenience and to the rights of its own citizens or other persons who are under the protection of its laws ….' (emphasis added)
However in the context of modern day international human rights we are concerned not only with the sensibilities of other states as proxies for their citizens, but with directly respecting the fundamental rights of internet users in other countries. Typically in internet cases those will be rights to privacy (as in the US Microsoft Warrant case) or freedom of expression (as in Equustek). The fundamental rights of internet users are a separate matter from the sensibilities of a nation state. To focus only on state sensitivities risks overlooking or understating the distinct interests of their citizens. (For more on this topic see my chapter in the recently published book ‘The Net and the Nation State’.)

Cases on extraterritorial injunctions tend to resolve themselves into questions not of whether a court has the power to make an extraterritorial order, but whether it should exercise that power and if so how. That is a question of discretion, involving any applicable principles on which discretion should be exercised, and voluntary jurisdictional self-restraint. When faced with a bad actor, an ugly set of facts and a demand for an effective remedy it is all the more important that a court should anxiously examine the basis for exercising its power and carefully identify and balance competing factors, even – perhaps especially - where the internet is concerned.

Whatever the future significance of the Equustek decision (a rich source of jurisprudence or a barren seed destined for obscurity) the factual background to the case is unusual and provides illuminating context for the way in which the Supreme Court of Canada approached the case. (Caveat: my comments are from the perspective of an English lawyer, with no particular knowledge of Canadian law.)

Background and context
The story starts fairly conventionally when Equustek, a manufacturer of electronic networking products, fell out with its distributor Datalink Technologies Gateways Inc ('Datalink'), which then operated from Vancouver. Equustek claimed that for many years Datalink had been re-labelling one of Equustek's products and passing it off as Datalink's own; that Datalink then acquired confidential information and misused it to design and manufacture a competing product; and that Datalink then passed off the competing product by supplying it in substitution for Equustek products advertised on its websites. Equustek terminated the distribution agreement and in April 2011 started litigation in British Columbia against Datalink and its principal.

Initially Datalink defended the proceedings. However the complexion of the dispute changed in 2012 with Datalink abandoning its defence, skipping the jurisdiction, setting up numerous shell companies, operating multiple websites and breaching various orders made by the Canadian courts.

Whatever may have been the merits (or not) of its original defence, Datalink now exhibited the demeanour of a fugitive from justice. Among the injunctions granted against Datalink during 2012 was an order freezing Datalink's assets worldwide. In September 2012 Equustek applied for Datalink and its principal to be found in contempt of court. The Canadian court issued a warrant for the arrest of the principal.

The defences of two of the defendants were struck out in June 2012 for failure to comply with court orders (and the third in March 2013). As the first instance judge (Fenlon J.) observed they were therefore presumed to admit the allegations against them. Although Equustek was given permission in June 2012 to apply for final judgment against Datalink it did not do so. As a consequence the interim orders made by the Canadian court continued in force.

Despite all this Datalink continued, according to the Supreme Court judgment, to carry on business from an unknown location, selling its impugned product on its websites to customers all over the world.

Google entered the picture in September 2012 when Equustek asked it to de-index the Datalink websites. Google refused, following which Equustek applied for an order requiring it to do so.

Google then told Equustek that if it obtained an order against Datalink prohibiting it from carrying on business on the internet, Google would remove specific webpages (but not, in accordance with its internal policy, entire websites).

Shortly afterwards, in December 2012, Equustek (supported by Google) obtained from the Canadian court an injunction against Datalink ordering it to "cease operating or carrying on business through any website". The SCC judgment does not state in terms whether this injunction was itself worldwide. However in the context of Datalink having moved its activities outside Canada, it would be unsurprising if the order were understood to include Datalink websites operated from outside Canada, and not limited to the .ca domain. In any event that is the implication of statements in the judgments that Datalink's activities outside Canada had breached that order.

Parenthetically, a previous judgment of the Canadian Supreme Court (Pro Swing) counsels the need to be explicit about the territorial scope of injunctions when dealing with the internet and territorially defined rights such as trade marks:
"The Internet component does not transform the US trademark protection into a worldwide one. … 
Extraterritoriality and comity cannot serve as a substitute for a lack of worldwide trademark protection. The Internet poses new challenges to trademark holders, but equitable jurisdiction cannot solve all their problems. In the future, when considering cases that are likely to result in proceedings in a foreign jurisdiction, judges will no doubt be alerted to the need to be clear as regards territoriality. Until now, this was not an issue because judgments enforcing trademark rights through injunctive relief were, by nature, not exportable."
Following the December 2012 order against Datalink Google voluntarily removed specific webpages from its .ca search results. Equustek became aware of the limitation to google.ca in May 2013 as the result of cross-examining a Google witness (1st instance judgment [75]).

The order against Datalink requiring it to cease carrying on business on the internet was, as the dissenting judgment in the Supreme Court points out, wider than Equustek's underlying claim against Datalink. That claim was for relief against specific aspects of Datalink's business: using Equustek's trade marks and free-riding on the goodwill of any Equustek product on any website, disparaging or in any way referring to Equustek products, distributing certain manuals and displaying images of Equustek's products on any website; and selling a named line of products alleged to have been created by the theft (sic) of Equustek's trade secrets.

But in the application against Google the effective complaint about Datalink moved away from Equustek's underlying infringement claims. The basis of the decision to grant an order against Google was that Datalink, by continuing business on the internet, was breaching the existing wide interim injunction – an order obtained with the support of Google and which, the Supreme Court says [SCC 34], Google had offered to comply with voluntarily. Third parties with notice of an interim injunction can be treated as if bound by it [SCC 29, 33]. The claim for a de-indexing injunction against Google was therefore for an order piggybacked on a pre-existing broad and, it seems, worldwide injunction against Datalink.

The significance of the order requiring Datalink to cease carrying on business on the internet can be seen at all three judicial levels. Fenlon J. at first instance said that the plaintiffs sought the injunction against Google to prevent continued and flagrant breaches of the court's orders in the underlying action [1st inst. 86]. The BC Court of Appeal described the injunction claimed against Google as 'ancillary relief designed to ensure that orders already granted against the defendants are effective' (emphasis added) [BCCA 2].

The Supreme Court emphasised that in the absence of de-indexing the sites Google was facilitating Datalink’s breach of the order “by enabling it to carry on business through the Internet” (emphasis added). [SCC 34] The de-indexing injunction against Google was said to flow from the necessity of Google’s assistance to prevent the facilitation of Datalink’s ‘ability to defy court orders and do irreparable harm to Equustek’ (emphasis added) [SCC 35].

The specifics of Equustek’s underlying complaints against Datalink and (pertinently, in a case where the appropriateness of a worldwide injunction was in issue) the extent to which they may have been based on territorially limited Canadian rights received relatively little attention.

The fact that an underlying claim is territorially limited does not mean that interim ancillary relief must be similarly limited. Otherwise it would not be possible to grant a worldwide asset-freezing injunction in a case based on a territorially limited right. However, as the minority judgment pointed out, a de-indexing injunction differs from an asset freezing injunction (the rationale for which is to maintain the integrity of the court's process [1st instance 132]) in that it enforces a plaintiff's asserted substantive rights [SCC 72].

In principle any consideration of whether to grant a worldwide de-indexing injunction against an intermediary ought therefore to take into account the nature and territorial extent of the claims made and rights asserted against the alleged wrongdoer. All the interests potentially affected by the grant of the injunction can then be identified and weighed. 

It does not necessarily follow that if the SCC had approached these issues differently the outcome would have changed significantly, or indeed at all. This was after all an appeal against an exercise of the court's discretion, which is entitled to a high degree of deference [SCC 22]. And the underlying defendant’s flouting of court orders was always likely to weigh heavily.

Nevertheless, the two dissenting judges of the SCC considered that the seven-judge majority had not exercised sufficient judicial restraint. They would not have made the order at all. In any event, a closer analysis of extraterritoriality might have provided a more detailed foundation on which to consider different factual situations in future cases.

The SCC's reasoning

My focus is on three aspects of the SCC's reasoning:
  • The range of interests engaged by an extraterritorial de-indexing injunction
  • Territoriality of underlying claims against Datalink; and
  • Approach to freedom of speech rights.

Interests engaged by a worldwide de-indexing injunction

The SCC discussed worldwide injunctions against offline intermediaries and domestic injunctions against online intermediaries such as ISP site blocking orders. These precedents, however, do not fully address the issues that arise with a global de-indexing injunction against a search engine.

Under existing caselaw ancillary orders may be granted that affect offline intermediaries such as banks. Such an order can cover freezing of a defendant’s assets and disclosure of information identifying bank accounts and their contents. Both elements can be granted on a worldwide basis. A third party such as a bank can be required to assist.

The courts have been at pains both to respect the position of the intermediary as a party not accused of wrongdoing and to minimise the possibility of conflict with foreign law. Thus the standard form English worldwide asset freezing injunction contains several safeguards:

- The Babanaft proviso. The order is only to affect a third party in a foreign country to the extent that the order is declared enforceable by, or is enforced by, a court in that country.

- An undertaking by the applicant not, without the permission of the court, to seek to enforce the order in any country outside England and Wales.

- The Baltic proviso. Nothing in the order, in respect of assets located outside England and Wales, prevents any third party (whether within or outside the jurisdiction) from complying with what it reasonably believes to be its obligation, contractual or otherwise, under the laws and obligations of the country or state in which those assets are situated; and any orders of the courts of that country or state.

These provisions achieve three things. The Babanaft proviso minimises potential comity and conflict of law problems by making the order conditional on enforcement in the foreign court; the undertaking enables the English court to prevent the applicant taking oppressive enforcement action in a foreign country; and the Baltic proviso recognises that even a third party within England and Wales (such as a London bank with a foreign branch where an account is held) ought not to be compelled to do something contrary to foreign law or court order.

The banking system, like the internet, is international. But the courts have recognised, even faced with hard cases such as alleged fraud, the need to pay careful attention to balancing the various state and non-state interests involved.

Asset freezing orders affect fewer interests than does a worldwide de-indexing injunction. A freezing injunction typically affects only the claimant, the defendant and its assets, any third party (such as a bank) that may be holding that defendant’s assets, and potentially the sovereignty of any state in which those assets may be held.

A worldwide de-indexing injunction against a search engine engages a new category: the millions of people around the world who would otherwise be able to seek out the material. This is novel territory, engaging a different kind of interest: freedom of speech at scale.

In Equustek the first instance judge Fenlon J. considered whether a Baltic proviso should be inserted in the order, but decided it was not necessary:
"In the present case, Google is before this Court and does not suggest that an order requiring it to block the defendants' websites would offend California law, or indeed the law of any state or country from which a search could be conducted. Google acknowledges that most countries will likely recognise intellectual property rights and view the selling of pirated products as a legal wrong." [1st inst. 144]… 
"Google was named in this application, served with materials, and attended the hearing. It is not therefore necessary to craft terms anticipating possible conflicts Google could face in complying with the interim injunction. No terms of this kind have been requested by Google and I see no basis on the record before me to expect such difficulties." [1st inst. 160]

The BC Court of Appeal added that "in the unlikely event that any jurisdiction finds the order offensive to its core values, an application could be made to court to modify the order so as to avoid the problem." [BCCA 94]

The SCC similarly relied on Google's ability to apply to modify the order:

"If Google has evidence that complying with such an injunction would require it to violate the laws of another jurisdiction, including interfering with freedom of expression, it is always free to apply to the British Columbia courts to vary the interlocutory order accordingly. To date, Google has made no such application" [SCC 46]

"In the absence of an evidentiary foundation, and given Google’s right to seek a rectifying order, it hardly seems equitable to deny Equustek the extraterritorial scope it needs to make the remedy effective, or even to put the onus on it to demonstrate, country by country, where such an order is legally permissible. We are dealing with the Internet after all, and the balance of convenience test has to take full account of its inevitable extraterritorial reach when injunctive relief is being sought against an entity like Google." (emphasis added) [SCC 47]
Unlike the Babanaft and Baltic provisos this places the burden on the third party to demonstrate that compliance with the injunction would place it in conflict with another state's laws, rather than crafting the injunction so as to minimise the risk of the third party being placed in that position in the first place. Of course asset freezing orders have to anticipate in a vacuum potential difficulties for third parties, since they are made without the presence of the third party in court. In Equustek Google was, as Fenlon J commented, before the court. Whether that is a good basis on which to shift the burden to the third party is a question that will no doubt be revisited in future cases.

The SCC was sceptical of the possibility of conflicts with other jurisdictions' laws. First it observed that there was no harm to Google because it did not have to take steps around the world, but only where its search engine was controlled. [SCC 43]

It went on:

"Google’s argument that a global injunction violates international comity because it is possible that the order could not have been obtained in a foreign jurisdiction, or that to comply with it would result in Google violating the laws of that jurisdiction is, with respect, theoretical. As Fenlon J. noted, 'Google acknowledges that most countries will likely recognize intellectual property rights and view the selling of pirated products as a legal wrong.'" [SCC 44]
The passage refers to comity. It does not address the new element introduced by a worldwide de-indexing injunction, namely potential interference with freedom of speech rights of individual internet users in other countries. This is different from the question of whether the injunction might require the search engine to violate the law of another state. It is quite possible for a search engine’s compliance with an injunction to inhibit users’ access to a website (and thus engage their freedom of speech rights) in another country without the search engine itself contravening the law of that country.

Nor does the passage distinguish between the abstract concept of a state recognising intellectual property rights and the question of what specific rights Equustek may or may not have possessed outside Canada, to which I now turn.

Territoriality of underlying claims
As to territoriality of Equustek's underlying claims, one of the claims was for passing off. A claim for passing off has to be supported by goodwill, which is territorial. A Canadian company doing business in Canada will own goodwill in Canada, plus in any other countries in which it does business. The geographic extent of the plaintiff’s business determines the geographic extent of its rights. The plaintiff does not …
Eager student: Encryption seems to be back in the news. Why has this come up again?
Scholarly lawyer: It never really went away. Ever since David Cameron sounded off about encryption before meeting Barack Obama in January 2015 it’s been bubbling under.
ES: What did David Cameron say about it?
SL: He said: “In extremis, it has been possible to read someone’s letter, to listen to someone’s call, to listen in on mobile communications, ... The question remains: Are we going to allow a means of communications where it simply is not possible to do that? My answer to that question is: no, we must not.” That sounded very much as if he wanted some kind of encryption ban.
ES: Didn’t Downing Street row back on that?
SL: At the end of June 2015 David Cameron said something very similar in Parliament. Downing Street followed up with: “The prime minister did not suggest encryption should be banned.” They said much the same to the BBC in July 2015.
ES: Now the focus seems to be specifically on end to end encryption.
SL: Yes. Amber Rudd said in March this year that E2E encryption was “completely unacceptable”. Downing Street weighed in again: “What the home secretary said yesterday is: where there are instances where law-enforcement agencies wish to gain access to messages which are important to an investigation, they should be able to do so.”
ES: Which brings us to this weekend?
SL: Yes. Amber Rudd has disclaimed any intention to ban end-to-end encryption completely, but at the same time she appears to want providers of E2E encrypted messaging services to provide a way of getting access.
ES: So where does that leave us?
SL: The government evidently wants to do something with end to end encryption. But exactly what is unclear.
ES: Can we ask them to make it clear?
SL: Many have tried. All have failed. That isn’t really surprising, since the very nature of end to end encryption is that the messaging provider has no way of decrypting it.
ES: So if the messaging provider does have a way in, it’s no longer true end to end encryption?
SL: Exactly.
ES: But hasn't end to end encryption been around for years?
SL: In the form of standalone software like PGP, yes. In fact that is what sparked the First Crypto War in the 1990s.
ES: Which ended up with universally available public key encryption?
SL: Exactly. The encryption genie couldn’t be put back in the bottle – you can write a public key encryption algorithm on a T-shirt - and they stopped fighting it.
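As an aside, the "T-shirt" point can be made concrete with a toy RSA round trip in a few lines of Python (a sketch with tiny primes, purely for illustration; real deployments use keys thousands of bits long plus padding schemes):

```python
# Toy RSA key generation, encryption and decryption -- NOT secure,
# the tiny primes are chosen only to make the arithmetic visible.
p, q = 61, 53
n = p * q                  # public modulus
phi = (p - 1) * (q - 1)    # Euler's totient of n
e = 17                     # public exponent, coprime with phi
d = pow(e, -1, phi)        # private exponent: modular inverse of e (Python 3.8+)

message = 65                        # any integer smaller than n
ciphertext = pow(message, e, n)     # encrypt with the PUBLIC key (e, n)
recovered = pow(ciphertext, d, n)   # decrypt with the PRIVATE key (d, n)
assert recovered == message
```

Anyone can encrypt with the public pair (e, n); only the holder of d can decrypt — which is why, once the mathematics was public, the technique could not be suppressed.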
ES: So what has changed now?
SL: Apps and the cloud. Software such as PGP is an add-on, like anti-virus software.  I make the decision to get PGP from somewhere and to use it with my e-mail. It has nothing to do with my e-mail provider.  But now messaging service providers are incorporating E2E encryption as part of their service.
ES: What difference does that make?
SL: Commercially, the provider will be seen as part of the loop and so as a target for regulatory action. Technically, if the communications touch the provider’s servers someone might think that the provider should be able to access them in response to a warrant.
ES: PGP-encrypted e-mails are also stored in the e-mail provider’s servers, but the provider can't decrypt those.
SL: Certainly. But if the messaging service provider itself provides the ability for me to encrypt my messages as part of its service, then it could be said that it has more involvement. It may store some information on its servers, for instance so that I can set up a connection with an offline user.
ES: If the provider does all that, why can’t it decrypt my messages?
SL: Because I and my counterparty user are generating and applying the encryption keys. With full end to end encryption the service provider never possesses or sees the private key that my app uses to encrypt and decrypt messages.
ES: But that’s the case only for full end to end encryption, right?
SL: Yes, there are other encryption models where the service provider has a key that it could use to decrypt the message.
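The full end-to-end model — each user's device generating and holding its own private key, with the provider relaying only public values — can be sketched with a toy Diffie-Hellman exchange (tiny numbers, purely illustrative; real messaging apps use elliptic-curve variants such as X25519):

```python
# Toy Diffie-Hellman key agreement -- tiny numbers for illustration only.
p, g = 23, 5   # public prime modulus and generator, known to everyone

alice_private = 6    # generated on Alice's device, never transmitted
bob_private = 15     # generated on Bob's device, never transmitted

# The only values the messaging provider ever relays:
alice_public = pow(g, alice_private, p)
bob_public = pow(g, bob_private, p)

# Each end combines its own private key with the other's public value.
alice_shared = pow(bob_public, alice_private, p)
bob_shared = pow(alice_public, bob_private, p)
assert alice_shared == bob_shared  # same key at both ends
```

The provider sees `alice_public` and `bob_public` pass across its servers, but without either private key it cannot derive the shared key — which is the technical sense in which it "has no way of decrypting" the messages.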
ES: If it never sees the key and cannot decrypt your message, isn’t the service provider in the same position with end to end encryption as with original PGP? What can the service provider be made to do if it doesn’t have a key?
SL: Now we need to delve into the UK’s interception legislation. Buckle your seatbelt.
ES: Ready.
SL: As you know the new Investigatory Powers Act 2016, like the existing Regulation of Investigatory Powers Act 2000, includes power to serve an interception warrant on a telecommunications operator.
ES: Would that include a messaging provider?
SL: Yes. It shouldn’t include someone who merely supplies encryption software like PGP, but a messaging service provider would be in the frame to have a warrant served on it.
ES: What can a messaging provider be made to do?
SL: It could be required to assist with the implementation of the warrant. If it does have a key, then it could assist by using its key to decrypt any intercepted messages.
ES: Is that a new requirement under the IPAct?
SL: No, RIPA is the same. And even if the provider handed over only an encrypted message, a separate RIPA power could be deployed to make it use its key to decrypt the message.
ES: And if the telecommunications operator doesn’t have a key? How can it assist with the interception warrant?
SL: All it can do is hand over the encrypted message. Both RIPA and the new IPAct say that the telecommunications operator can be required to do only what is reasonably practicable in response to a warrant. If it has no key it cannot be made to do more.
ES: Is that it?
SL: No, the government has one more card, which might be a trump.  Under both the new IP Act and existing RIPA the Minister can serve a notice (a 'technical capability notice', or TCN) on a telecommunications operator requiring it to install a permanent interception capability. This can include the capability to remove any electronic protection applied ‘by or on behalf’ of the telecommunications operator.
ES: Does ‘electronic protection’ include encryption?
SL: Yes. But pay attention to ‘applied by or on behalf of’. If the encryption is applied by the user, not the telecommunications operator, then a TCN cannot require the telecommunications operator to remove it.
ES: So a lot could turn on whether, in the particular system used by the operator, the encryption is regarded as being applied by or on behalf of the operator?
SL: Yes. If so, then the TCN can require the operator to have the capability to remove it.
ES: But if the operator doesn’t have a key, how can that be reasonably practicable?
SL: For an operator subject to a TCN who is served with a warrant, reasonable practicability assumes that it has the capability required by the TCN.
ES: So the operator is deemed to be able to do the impossible. How do we square that circle?
SL: A Secretary of State considering whether to issue a TCN has to take into account technical feasibility. Clearly it is not technically feasible for an operator who provides its users with true end-to-end encryption facilities to have a capability to remove the encryption, since it has no decryption key. That might mean that a TCN could not require an operator to do that.
ES: But what if the Secretary of State were to argue that it was technically feasible for the operator to adopt a different encryption model in which it had a key?
SL: Good point.  If that argument held up then the service provider would presumably have to stop offering true end to end encryption facilities in order to comply with a TCN.
ES: Could a TCN be used in that way, to make a telecommunications operator provide a different kind of encryption? Wouldn't that be tantamount to making it provide a different service? And how would we know whether the Secretary of State was trying to do this?
SL: That’s difficult, because a telecommunications operator is required to keep a TCN secret. One possibility is that the new Investigatory Powers Commissioner may proactively seek out controversial interpretations of the legislation that have been asserted and make them public.
ES: Is there a precedent for that?
SL: Yes, the Intelligence Services Commissioner Sir Mark Waller in his 2014 Report discussed whether there was a legal basis for thematic property interference warrants. David Anderson QC’s Bulk Powers Review has supported the idea that the Investigatory Powers Commissioner should do this.
ES: So what happens next?
SL: Draft TCN regulations have recently been consulted on and presumably will be laid before Parliament at some point after the election.  If those are approved, then the ground will have been prepared to approve and serve new TCNs once the IPAct comes into force, which will most likely be later this year.
ES: Thank you.

The Investigatory Powers Bill, now the newly minted Investigatory Powers Act, has probably undergone more scrutiny than any legislation in recent memory. Rarely, though, can the need for scrutiny have been so great.

Over 300 pages make up what then Prime Minister David Cameron described as the most important Bill of the last Parliament. When it comes into force the IP Act will replace much of RIPA (the Regulation of Investigatory Powers Act 2000), described by David Anderson Q.C.’s report A Question of Trust as ‘incomprehensible to all but a tiny band of initiates’. It will also supersede a batch of non-RIPA powers that had been exercised in secret over many years - some, so the Investigatory Powers Tribunal has found, on the basis of an insufficiently clear legal framework. 
None of this would have occurred but for the 2013 Snowden revelations of the scale of GCHQ’s use of bulk interception powers. Two years post-Snowden the government was still acknowledging previously unknown (except to those in the know) uses of opaque statutory powers. 
Three Reviews and several Parliamentary Committees later, it remains a matter of opinion whether the thousands of hours of labour that went into the Act have brought forth a swan or a turkey. If the lengthy incubation has produced a swan, it is one whose feathers are already looking distinctly ruffled following the CJEU judgment in Watson/Tele2, issued three weeks after Royal Assent. That decision will at a minimum require the data retention aspects of the Act to be substantially amended. 
So, swan or turkey?
Judicial approval
On the swan side, warrants for interception and equipment interference, together with most types of power exercisable by notice, will be subject to prior approval by independent Judicial Commissioners. For some, doubts persist about the degree of scrutiny that will be exercised. Nevertheless judicial approval is a significant improvement on current practice whereby the Secretary of State alone takes the decision to issue a warrant.
Codified powers
Also swan-like is the impressive 300 page codification of the numerous powers granted to law enforcement and intelligence agencies. A Part entitled ‘Bulk warrants’ is a welcome change from RIPA’s certificated warrants, which forced the reader to play hopscotch around a mosaic of convoluted provisions before the legislation would give up its secrets.
Granted, the IP Act also ties itself in a few impenetrable knots. Parts are built on shaky or even non-existent definitional foundations. But it would be churlish not to acknowledge the IP Act’s overall improvement over its predecessors. 
Parliamentary scrutiny
When we move to consider the Parliamentary scrutiny of bulk powers things become less elegant.
The pre-legislative Joint Committee acknowledged that the witnesses were giving evidence on the basis of incomplete information. In response to the Joint Committee’s recommendation the government produced an Operational Case for Bulk Powers alongside the Bill’s introduction into Parliament. That added a little light to that which A Question of Trust had previously shed on the use of bulk powers. 
But it was only with the publication of David Anderson’s Bulk Powers Review towards the end of the Parliamentary process that greater insight into the full range of ways in which bulk powers are used was provided from an uncontroversial source. (By way of example ‘selector’ - the most basic of bulk interception terms - appears 27 times in the Bulk Powers Review, five times in A Question of Trust and twice in the Operational Case, but not at all in either the Joint Parliamentary Scrutiny Committee Report or the Intelligence and Security Committee Report.)
By the time the Bulk Powers Review was published it was too late for the detailed information within it to fuel a useful Parliamentary debate on how any bulk powers within the Act should be framed. David Anderson touched on the timing when he declined to enter into a discussion of whether bulk powers might be trimmed:
“I have reflected on whether there might be scope for recommending the “trimming” of some of the bulk powers, for example by describing types of conduct that should never be authorised, or by seeking to limit the downstream use that may be made of collected material. But particularly at this late stage of the parliamentary process, I have not thought it appropriate to start down that path. Technology and terminology will inevitably change faster than the ability of legislators to keep up. The scheme of the Bill, which it is not my business to disrupt, is of broad future-proofed powers, detailed codes of practice and strong and vigorous safeguards. If the new law is to have any hope of accommodating the evolution of technology over the next 10 or 15 years, it needs to avoid the trap of an excessively prescriptive and technically-defined approach.”
In the event the legislation was flagged through on the Bulk Powers Review’s finding that the powers have a clear operational purpose and that the bulk interception power is of vital utility.
Fully equipped scrutiny at an early stage of the Parliamentary process could have resulted in more closely tailored bulk powers. As discussed below (“Vulnerability to legal challenge”) breadth of powers may come back to haunt the government in the courts.
Mandatory data retention
Views on expanded powers to compel communications data retention are highly polarised. But swan or turkey, data retention will become an issue in the courts. The CJEU judgment in Watson/Tele2, although about the existing DRIPA legislation, will require changes to the IP Act. How extensive those changes need to be will no doubt be controversial and may lead to new legal challenges. So, most likely, will the extension of mandatory data retention to include generation and obtaining of so-called internet connection records: site-level web browsing histories.  
Many would say that officially mandated lists of what we have been reading, be that paper books or websites, cross a red line. In human rights terms that could amount to failure to respect the essence of privacy and freedom of expression: a power that no amount of necessity, proportionality, oversight or safeguarding can legitimise.
Limits on powers v safeguards
The Act is underpinned by the assumption that breadth of powers can be counterbalanced by safeguards (independent prior approval, access restrictions, oversight) and soft limits on their exercise (necessity and proportionality). 
Those may provide protection against abuse. That is of little comfort if the objection is to a kind of intended use: for instance mining the communications data of millions in order to form suspicions, rather than starting with grounds for specific suspicion.
The broader and less specific the power, the more likely it is that some intended but unforeseen or unappreciated use of it will be authorised without prior public awareness and consent. That happened with S.94 of the Telecommunications Act 1984 and, arguably, with bulk interception under RIPA. Certainly, the coming together of the internet and mobile phones resulted in a shift in the intrusion and privacy balance embodied in the RIPA powers. This was facilitated by the deliberate future-proofing of RIPA powers to allow for technological change, an approach repeated (not to its benefit, I would argue) in the IP Act.
In A Question of Trust David Anderson speculated on a future Panopticon of high tech intrusive surveillance powers:
“Much of this is technically possible, or plausible. The impact of such powers on the innocent could be mitigated by the usual apparatus of safeguards, regulators and Codes of Practice. But a country constructed on such a basis would surely be intolerable to many of its inhabitants. A state that enjoyed all those powers would be truly totalitarian, even if the authorities had the best interests of its people at heart.”
He went on to say, in relation to controlling the exercise of powers by reference to fundamental rights principles of necessity and proportionality:
“Because those concepts as developed by the courts are adaptable, nuanced and context-specific, they are well adapted to balancing the competing imperatives of privacy and security. But for the same reasons, they can appear flexible, and capable of subjective application. As a means of imposing strict limits on state power (my second principle, above) they are less certain, and more contestable, than hard-edged rules of a more absolute nature would be.”
The IP Act abjures hard-edged rules. Instead it grants broad powers mitigated by safeguards and by the day to day application of soft limits: necessity and proportionality.
The philosophy of granting broad powers counterbalanced by safeguards and soft limits reflects a belief that, because the UK has a long tradition of respect for liberty, we can and should trust our authorities, suitably overseen, with powers that we would not wish to see in less scrupulous hands. 
Another view is that the mark of a society with a long tradition of respect for liberty is that it draws clear red lines. It does not grant overly broad or far-reaching powers to state authorities, however much we may believe we can trust them (and their supervisors) and however many safeguards against abuse we may install. 
Both approaches are rooted in a belief (however optimistic that may sometimes seem) that our society is founded on deeply embedded principles of liberty. Yet they lead to markedly different rhetoric and results.
Be that as it may, the IP Act grants broad general powers. Will the Act foster trust in the system that it sets up? 
The question of trust
David Anderson’s original Review was framed as “A Question of Trust”. Although we may believe a system to be operated by dedicated public servants of goodwill and integrity, nevertheless for the sceptic the answer to the question of trust posed by intrusive state powers is found in a version of the precautionary principle: the price of liberty is eternal vigilance.
Whoever may have coined that phrase, the slavery abolitionist Wendell Phillips in 1852 emphasised that it concerns the people at large as well as institutions:
“Eternal vigilance is the price of liberty; … Only by continued oversight can the democrat in office be prevented from hardening into a despot; only by unintermitted agitation can a people be sufficiently awake to principle not to let liberty be smothered in material prosperity.”
Even those less inclined to scepticism may think that a system of broad, general powers and soft limits merits a less generous presumption of trust than specifically limited, concretely defined powers. 
Either way a heavy burden is placed on oversight bodies to ensure openness and transparency. To quote A Question of Trust: “…trust depends on verification rather than reputation, …”. 
One specific point deserves highlighting: the effectiveness of the 5 year review provided for by the IP Act will depend upon sufficient information about the operation of the Act being available for evaluation.
Hidden legal interpretations
Transparency brings us to the question of hidden legal interpretations. The Act leaves it up to the new oversight body whether or not proactively to seek out and publish material legal interpretations on the basis of which powers are exercised or asserted.
That this can be done is evident from the 2014 Report of Sir Mark Waller, the Intelligence Services Commissioner, in which he discusses whether there is a legal basis for thematic property interference warrants. That, however, is a beacon in the darkness. Several controversial legal interpretations were hidden until the aftermath of Snowden forced them into public light. 
David Anderson QC in his post-Act reflections has highlighted this as a “jury is out” point, emphasising that “the government must publicise (or the new Commission must prise out of it)” its internal interpretations of technical or controversial concepts in the new legislation. In A Question of Trust he had recommended that public authorities should consider how they could better inform Parliament and the public about how they interpret powers.
Realistically we cannot safely rely on government to do it. The Act includes a raft of new secrecy provisions behind which legal interpretations of matters such as who applies end-to-end encryption (the service provider or the user), the meaning of ‘internet communications service’, the dividing line between content and secondary data and other contentious points could remain hidden from public view. It will be interesting to see whether the future Investigatory Powers Commission will make a public commitment to implement the proposal.
Vulnerability to legal challenge
In the result the Act is long on safeguards but short on limits to powers. This structure looks increasingly likely to run into legal problems. 
Take the bulk interception warrant-issuing power. It encompasses a variety of differing techniques. They range from real-time application of 'strong selectors' at the point of interception (akin to multiple simultaneous targeted interception), through to pure ‘target discovery’: pattern analysis and anomaly detection designed to detect suspicious behaviour, perhaps in the future using machine learning and predictive analytics. Between the two ends of the spectrum are seeded analysis techniques, applied to current and historic bulk data, where the starting point for the investigation is an item of information associated with known or suspected wrongdoing.
The Act makes no differentiation between these different techniques. It is framed at an altogether higher level: necessity for general purposes (national security, alone or in conjunction with serious crime or UK economic well-being), proportionality and the like.
Statutory bulk powers could be differentiated and limited. For instance distinctions could be made between seeded and unseeded data mining. If pattern recognition and anomaly detection is valuable for detecting computerised cyber attacks, legislation could specify its use for that purpose and restrict others. Such limitations could prevent it being used for attempting to detect and predict suspicious behaviour in the general population, Minority Report-style. 
The lack of any such differentiation or limitation in relation to specific kinds of bulk technique renders the Act potentially vulnerable to future human rights challenges. Human rights courts are already suggesting that if bulk collection is not inherently repugnant, then at least the powers that enable it must be limited and differentiated.
Thus in Schrems the CJEU (echoing similar comments in Digital Rights Ireland at [57]) said:
“…legislation is not limited to what is strictly necessary where it authorises, on a generalised basis, storage … without any differentiation, limitation or exception being made in the light of the objective pursued.” (emphasis added)
The same principles are elaborated in the CJEU’s recent Watson/Tele2 judgment, criticising mandatory bulk communication data retention:
“It is comprehensive in that it affects all persons using electronic communication services, even though those persons are not, even indirectly, in a situation that is liable to give rise to criminal proceedings. It therefore applies even to persons for whom there is no evidence capable of suggesting that their conduct might have a link, even an indirect or remote one, with serious criminal offences. Further, it does not provide for any exception, and consequently it applies even to persons whose communications are subject, according to rules of national law, to the obligation of professional secrecy ….
106 Such legislation does not require there to be any relationship between the data which must be retained and a threat to public security. In particular, it is not restricted to retention in relation to (i) data pertaining to a particular time period and/or geographical area and/or a group of persons likely to be involved, in one way or another, in a serious crime, or (ii) persons who could, for other reasons, contribute, through their data being retained, to fighting crime …” (emphasis added)
The CJEU is also due to rule on the proposed agreement between the EU and Canada over sharing of Passenger Names Records (PNR data). The particular interest of the PNR case is that the techniques intended to be applied to bulk PNR data are similar to the kind of generalised target discovery techniques that could be applied to bulk data obtained under the IP Act powers. As described by Advocate General Mengozzi in his Opinion of 8 September 2016 this involves cross-checking PNR data with scenarios or profile types of persons at risk:
“… the actual interest of PNR schemes … is specifically to guarantee the bulk transfer of data that will allow the competent authorities to identify, with the assistance of automated processing and scenario tools or predetermined assessment criteria, individuals not known to the law enforcement services who may nonetheless present an ‘interest’ or a risk to public security and who are therefore liable to be subjected subsequently to more thorough individual checks.”
AG Mengozzi recommends that the Agreement must (among other things):
- set out clear and precise categories of data to be collected (and exclude sensitive data)
- include an exhaustive list of offences that would entitle the authorities to process PNR data
- in order to minimise ‘false positives’ generated by automated processing, contain principles and explicit rules:
  • concerning scenarios, predetermined assessment criteria and databases with which PNR would be compared, which must
  • to a large extent make it possible to arrive at results targeting individuals who might be under a reasonable suspicion of participating in terrorism or serious transnational crime, and which must
  • not be based on an individual’s racial or ethnic origin, his political opinions, his religion or philosophical beliefs, his membership of a trade union, his health or his sexual orientation.
As bulk powers come under greater scrutiny it seems likely that questions of limitation and differentiation of powers will come more strongly to the fore. The IP Act’s philosophy of broad powers counterbalanced with safeguards and soft limits may have produced legislation too generalised in scope and reach to pass muster.

Success in getting broad generally framed powers onto the statute book, though it may please the government in the short term, may be storing up future problems in the courts. One wonders whether, in a few years’ time, the government will come to regret not having fashioned a more specifically limited and differentiated set of powers.

[Amended 31 December 2016 to make clear that not all of RIPA is replaced.]


David Anderson Q.C.’s Bulk Powers Review made only one formal recommendation (a Technical Advisory Panel to assist the proposed Investigatory Powers Commission).

However the report drops a tantalising hint of the debate that might have taken place if the Review had been commissioned before the Bill started its passage through Parliament instead of almost at the end.

At [9.17] Anderson says:

“I have reflected on whether there might be scope for recommending the “trimming” of some of the bulk powers, for example by describing types of conduct that should never be authorised, or by seeking to limit the downstream use that may be made of collected material. 
But particularly at this late stage of the parliamentary process, I have not thought it appropriate to start down that path.  Technology and terminology will inevitably change faster than the ability of legislators to keep up.  The scheme of the Bill, which it is not my business to disrupt, is of broad future-proofed powers, detailed codes of practice and strong and vigorous safeguards.  If the new law is to have any hope of accommodating the evolution of technology over the next 10 or 15 years, it needs to avoid the trap of an excessively prescriptive and technically-defined approach.”
Let us put aside whether it is sensible or appropriate to try to future-proof powers – my view is that to do so repeats the error of RIPA – and then put aside the debate about whether bulk powers should exist at all. How might one go about a task of trimming bulk powers? What types of conduct might be candidates for never being authorised? What sort of limits on downstream use might be desirable and feasible?

The Report illustrates, perhaps more clearly than before, the very wide range of techniques that are brought to bear on bulk data (whether sourced from interception, equipment interference, bulk communications data acquisition or Bulk Personal Datasets). They range from real-time application of 'strong selectors' at the point of interception (akin to multiple simultaneous targeted interception), through to generalised pattern analysis and anomaly detection (utilised by MI6 on Bulk Personal Datasets in Case Study A11/2) designed to detect suspicious behaviour, perhaps in the future using machine learning and predictive analytics.

Pattern analysis is similar to data mining techniques described in A Question of Trust (AQOT):
"14.43. It is sometimes assumed that GCHQ employs automated data mining algorithms to detect target behaviour, as is often proposed in academic literature. That, it would say, is realistic for tasks such as financial fraud detection, but not for intelligence analysis."
AQOT included possible future developments of such techniques as one of several examples of capabilities that, at least cumulatively, would go beyond Bentham's Panopticon:

"13.19(d) A constant feed of data from vehicles, domestic appliances and health-monitoring personal devices would enable the Government to identify suspicious (or life-threatening) patterns of behaviour, and take pre-emptive action to warn of risks and protect against them."
AQOT commented on those examples:

"13.20 Much of this is technically possible, or plausible. The impact of such powers on the innocent could be mitigated by the usual apparatus of safeguards, regulators and Codes of Practice. But a country constructed on such a basis would surely be intolerable to many of its inhabitants. A state that enjoyed all those powers would be truly totalitarian, even if the authorities had the best interests of its people at heart.
13.21. There would be practical risks: not least, maintaining the security of such vast quantities of data. But the crucial objection is that of principle. Such a society would have gone beyond Bentham’s Panopticon…"
Between the two ends of the spectrum are seeded analysis techniques, applied to current and historic bulk data. AQOT again:

"Much of [GCHQ's] work involves analysis based on a fragment of information which forms the crucial lead, or seed, for further work. GCHQ’s tradecraft lies in the application of lead-specific analysis to bring together potentially relevant data from diverse data stores in order to prove or disprove a theory or hypothesis. As illustrated by the case study on GCHQ’s website, significant analysis of data may be required before any actual name can be identified. This tradecraft requires very high volumes of queries to be run against communications data as results are dynamically tested, refined and further refined. GCHQ runs several thousand such communications data queries every day. One of the benefits of this targeted approach to data mining is that individuals who are innocent or peripheral to an investigation are never looked at, minimising the need for intrusion into their communications."
A similar explanation of seeded analysis of bulk data was given by Lord Evans in evidence to the Commons Public Bill Committee 24 March 2016. 

A "strong selectors" technique whereby the full catch from a transmission is stored for only a few seconds for processing before being discarded may rate relatively low on the Orwell scale.  Seeded analysis rates fairly high, since it relies on bulk data (albeit filtered to some degree) being stored for later querying. Unseeded pattern analysis and anomaly detection is off the scale.  It is closest to the characterisation by Mireille Delmas-Marty, a French jurist quoted in the Review report: "Instead of starting from the target to find the data, one starts with the data to find the target."
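The two ends of that spectrum can be caricatured in a few lines of code (a purely illustrative sketch with hypothetical identifiers — not a description of any agency system):

```python
from collections import Counter

# Strong selectors: test each intercepted item against known target
# identifiers at the point of interception; non-matches are discarded
# immediately and never stored.
STRONG_SELECTORS = {"+44 7700 900123", "target@example.com"}  # hypothetical

def strong_selector_filter(stream):
    for item in stream:
        if item["sender"] in STRONG_SELECTORS or item["recipient"] in STRONG_SELECTORS:
            yield item  # retained; everything else falls away at once

# Unseeded target discovery inverts the logic: retain everything, then
# mine the retained pool for unusual patterns — "start with the data to
# find the target". Here, crudely, high-volume senders.
def anomaly_candidates(retained_items, threshold=3):
    volume = Counter(item["sender"] for item in retained_items)
    return {sender for sender, n in volume.items() if n >= threshold}

stream = [
    {"sender": "alice@example.com", "recipient": "target@example.com"},
    {"sender": "bob@example.com", "recipient": "carol@example.com"},
]
matches = list(strong_selector_filter(stream))  # only the first item survives
```

The privacy-relevant difference is visible in the data flow: the first function needs a prior suspicion (the selector) and retains almost nothing; the second presupposes that the whole pool has been retained before any suspicion exists.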
As it stands the Bill's bulk powers regime would empower all these techniques with no distinction between them, leaving it to the judgement of the Secretary of State, the Judicial Commissioners and after-the-event oversight to regulate and possibly limit their use under principles of necessity and proportionality.

An informed debate about trimming bulk powers could entail discussion of whether unseeded pattern analysis and anomaly detection should be permitted, and if so whether only for very specific and limited purposes.  It could also look at whether specific rules should govern seeded analysis.  It might also consider whether individual sets of "strong selectors" should require separate warrants, by analogy with non-thematic targeted interception warrants. Regrettably, in part due to the late stage at which the Bulk Powers Review has taken place, very little such nuanced debate has taken place.
Trim in the Bill, not Codes of Practice
Limitations on the scope of powers belong in the Bill and should not be left to Codes of Practice.

Although the government often states that the Codes of Practice 'have statutory force' (see e.g. Letter from Lord Keen to Lord Rooker, 8 July 2016), they do not have the same force as a statute. Their status and effect are limited to that set out in Schedule 7 para 6 (which possibly confers on Codes of Practice a weaker general interpretative role than does RIPA s.72).
Trimming approaches
Different kinds of analytical techniques apart, possible approaches to trimming bulk powers can be considered by reference to different facets of the powers.  I give some illustrative examples below, not necessarily to advocate them but more as an aid to understanding.
A.    Purposes
The Bill as currently drafted applies three cumulative sets of purposes to the interception and equipment interference bulk powers:
1.       The statutory purposes (national security etc).  Some have called for national security to be defined.
2.      Operational purposes. A new government amendment in response to a suggestion from the Intelligence and Security Committee provides that a list of purposes approved by the Secretary of State must be maintained by the heads of the intelligence services. The Secretary of State must be satisfied that an operational purpose to be included in the list is specified in a greater level of detail than the statutory purposes.
3.      Overseas-related purpose. The Bulk Powers Operational Case places considerable weight on the fact that the bulk interception and equipment interference powers are overseas-related.  Thus BI is described at 7.1 as a 'capability designed to obtain foreign-focused intelligence'. Similarly BEI is described at 8.2 as 'foreign-focused'. However:
a.      Obtaining 'overseas-related' data need only be the main, not the sole, purpose of the warrant.
b.      Overseas-related communications include those in which the individual overseas is communicating with someone (or something) in the UK.
c.       The 'overseas-related' limitation on purpose is exhausted once the information has been acquired by means of the bulk interception or interference (see the comments on RIPA S.16 in the Liberty IPT case, para 101 et seq. The Bill is structured in a similar way.)
d.     As the Operational Case acknowledges, non-overseas-related communications and information (and associated secondary data and equipment data) may be incidentally acquired. While the Operational Case attempts to downplay the significance of this, it provides no evidence on which to conclude that collateral acquisition may not be on a substantial scale.
e.      There is no obligation to discard, or attempt to discard, or discard upon gaining awareness of its presence, non-overseas-related material acquired in this way.
f.        The need to obtain a targeted examination warrant in relation to persons within the British Islands applies only to content, not to secondary data or equipment data.
g.      Secondary data and equipment data will under the Bill include some material extracted from content that under RIPA would be regarded as content. The expanded categories appear to go wider than what might intuitively be thought of as communications data (see Section E below).
h.      The purposes for which the Operational Case contemplates that secondary data and equipment data may be analysed go far beyond the limited purpose of ascertaining the location of a person ventilated in the Liberty IPT case (see Section F below).
Some possible approaches to trimming:
(1)   Limit the downstream use that can be made of collected material (whether content or secondary data/equipment data) to match the overseas-related main purpose for which it can be collected.
(2)  An obligation to seek out and remove, or remove upon gaining awareness of its presence, non-overseas-related material.
(3)  Raise the location threshold, so that a British Islands resident does not automatically lose content protection merely by venturing half-way across the English Channel (cf Keir Starmer, Commons Committee, 12 April 2016 at col. 116).

B.    Types of data and communication
With one exception the bulk powers in the Bill make no distinction between types of communication. They range from human to human messaging of various types through to automated communications and single-user activities such as browsing websites.
The one exception arises from the definition of overseas-related communications, applicable to interception and equipment interference bulk powers: communications sent by or received by individuals who are outside the British Islands. 
This would include an e-mail sent by an individual within the British Islands to an individual outside the British Islands and vice versa. It would exclude a search request sent by an individual within the British Islands to an overseas server (since there is a server, not an individual, at the other end). But it would include a search request sent by an individual outside the British Islands to a UK server.
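The inclusion/exclusion rule in those examples can be stated mechanically. A minimal sketch (a hypothetical helper, simplifying each endpoint to a kind — "individual" or "server" — and a location):

```python
# Illustrative only: applies the stated definition that a communication
# is overseas-related if sent by, or received by, an individual who is
# outside the British Islands. Endpoints are (kind, location) pairs.

def overseas_related(sender, recipient):
    return any(
        kind == "individual" and location == "overseas"
        for kind, location in (sender, recipient)
    )

# E-mail from the British Islands to an individual abroad: included.
assert overseas_related(("individual", "UK"), ("individual", "overseas"))
# Search request from the British Islands to an overseas server: excluded
# (a server, not an individual, is at the other end).
assert not overseas_related(("individual", "UK"), ("server", "overseas"))
# Search request from an individual abroad to a UK server: included.
assert overseas_related(("individual", "overseas"), ("server", "UK"))
```

The sketch also exposes the definitional gaps discussed below: nothing in the rule says how to classify a corporate account or a machine-generated message, i.e. when an endpoint counts as an "individual" at all.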
The significance of this exclusion is, however, reduced by the ‘by-catch’ provisions. Unless the agencies are able to filter out excluded material at the point of collection, it is, as with RIPA, collectable as a necessary incident and falls into the general pool of selectable data.
The Bill contains no indication of when a communication is to be regarded as sent by or received by an individual. An e-mail or text message addressed to an individual clearly is so. What about an e-mail addressed to, or sent by, a corporate account? What about machine-generated e-mails? When is a communication generated by or sent to an individual’s device without the knowledge of the individual to be regarded as sent or received by the individual? Background smartphone communications are an obvious example. What if a car, without the owner/driver/passenger’s knowledge, automatically generates and sends an e-mail requesting a service or an emergency message, including associated location data?
Some possible approaches to trimming:
(1)   Limit the extent to which background and machine generated communications may be regarded as sent or received by an individual.
(2)  An obligation as in A(2) above to remove non-overseas-related material would imply an obligation to remove kinds of overseas communication not sent or received by an individual.
(3)  Should powers apply to all types of communication, or only human to human messaging?
C.    Types of conduct authorised
Some possible approaches to trimming:
(1)   Limit scope by reference to concrete types of conduct that can (or specifically cannot) be authorised. The Center for Democracy & Technology submission to the draft Bill Joint Committee at [42], repeated in CDT evidence to the Public Bill Committee at [20] to [25], suggested this kind of approach for equipment interference warrants in relation to the possibility of mandating encryption back doors.
D.   Use of incidentally collected data
As discussed in my evidence to the Joint Committee ([117] to [137]) and above in relation to overseas-related communications, there is a fundamental issue concerning the extent to which domestic content and secondary data collected as a by-product of the overseas-related bulk powers can be used in non-overseas-related ways.
Some possible approaches to trimming:
(1)   As above (B(1)).

E.    Extent of secondary data and equipment data
The Bill embodies a significant shift (compared with RIPA) towards classifying various types of content as secondary data or equipment data (see my blog post). The Bill appears to go further than extracting communications traffic data (e-mail addresses and the like) from the body of a communication such as an e-mail. It appears to include the ‘who, where and when’ not just of communications, but of people’s real world activities per se.
Some possible approaches to trimming:
(1)   Limit extracted metadata to true communications data (i.e. data about communications).
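By way of illustration only, the distinction in suggestion (1) between true communications data and content can be sketched in code. The message, field names and the classification applied are invented for the example and do not reflect the Bill's definitions:

```python
import email
from email import policy

# Hypothetical message for illustration only.
RAW = b"""From: alice@example.com
To: bob@example.org
Date: Mon, 06 Jun 2016 10:00:00 +0000
Subject: Meeting location

Let's meet at the usual place."""

def extract_communications_data(raw: bytes) -> dict:
    """Keep only envelope-style metadata (the who/when of a message);
    the subject line and body are treated as content and discarded."""
    msg = email.message_from_bytes(raw, policy=policy.default)
    return {
        "from": msg["From"],
        "to": msg["To"],
        "date": msg["Date"],
        # Deliberately omitted: Subject and body, which reveal the
        # meaning of the communication rather than its routing.
    }

print(extract_communications_data(RAW))
```

The point of the sketch is that the boundary is a design choice: nothing technical prevents the extraction routine from also harvesting content-derived fields, which is why a statutory limit of the kind suggested would have to be drawn in the legislation itself.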
F.     Types of use of bulk secondary, equipment and communications data
Various uses of bulk metadata have been ventilated. The Bulk Powers Review contains numerous examples. The types of use can differ significantly from each other. For instance:
-         To determine whether the sender or recipient of a communication is within or outside the British Islands (the very limited purpose advanced by the government in the Liberty IPT case – see my evidence to the Joint Committee at [128] to [130])
-         To have visibility of a full historic record so that authorities can go back and find out after the event about a malefactor’s communications and online activities
-         Seeded analysis to find a target’s associates or more about a target’s identity (as discussed above)
-         Target discovery based on patterns of behaviour, as discussed above (see also Operational Case [3.3] and [3.6]).
These various uses have different implications for the rationale for collecting data in bulk. At one end of the spectrum bulk collection is seen as a necessary evil, required only because, for technical reasons (e.g. fragmentation of packets or presence of the target in other countries), target communications cannot be separated at the point of collection from the rest. That may hold out the prospect that, as technology improves, it becomes possible to carry out more targeted bulk collection, particularly as real time capabilities increase.
At the other end of the spectrum (pattern detection and predictive analysis) bulk collection can become more of an end in itself: amassing data so as to provide the most accurate ‘normal’ baseline against which ‘suspicious’ behaviour patterns can be detected. This appears to carry no prospect of reducing the quantity of metadata collected – probably the opposite.
The Bill is almost completely devoid of concrete limitations on, or distinctions between, the types of use that can be made of bulk metadata. The limits are the statutory purposes, operational purposes and necessity and proportionality. The Bulk Powers Review proposes a Technical Advisory Panel to assist the Investigatory Powers Commission in keeping technological developments under review.
Some possible approaches to trimming:
Limitations on use could be based on e.g.
(1)   the justification provided to the IPT in Liberty;
(2)  specific seeded analysis versus more generalised pattern detection;
(3)  limitations on the number of hops followed when tracing possible associations (Twitter followers, Facebook friends etc.);
(4)  applying the non-British Islands examination restriction to metadata searches (note Operational Case paras 5.14 to 5.19).
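A hop limit of the kind contemplated in suggestion (3) is, in computing terms, a cap on the depth of a contact-chaining traversal. The sketch below uses an invented contact graph purely to illustrate how such a statutory cap would bite:

```python
from collections import deque

# Invented contact graph for illustration only.
CONTACTS = {
    "seed": ["a", "b"],
    "a":    ["c"],
    "b":    ["d"],
    "c":    ["e"],
    "e":    ["f"],
}

def associates_within(graph: dict, seed: str, max_hops: int) -> set:
    """Breadth-first traversal that stops expanding beyond max_hops,
    modelling a cap on seeded contact-chaining depth."""
    seen = {seed}
    frontier = deque([(seed, 0)])
    while frontier:
        node, hops = frontier.popleft()
        if hops == max_hops:
            continue  # do not follow links beyond the permitted depth
        for neighbour in graph.get(node, []):
            if neighbour not in seen:
                seen.add(neighbour)
                frontier.append((neighbour, hops + 1))
    return seen - {seed}

print(associates_within(CONTACTS, "seed", 2))  # reaches a, b, c and d
```

Each additional permitted hop expands the pool of examinable associates, which is why the number of hops is a meaningful place to draw a proportionality line.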
G.    Types and location of conduct authorised by warrants
The bulk warrantry system seems to allow for three possibilities:
(1)   Unilateral conduct by the intercepting or equipment interfering agency without the knowledge or assistance of the CSP
(2)  Assisted conduct under a warrant supported by a technical capability notice
(3)  Assisted conduct under a warrant without the support of a technical capability notice
The Bill does not specify the circumstances in which these different approaches are or are not appropriate (other than technical capability notices for equipment interference being limited under Clause 228(10)/(11) to UK CSPs). Nor are the different approaches addressed in the Operational Case. Similarly AQOT:
"Implementing a s8(1) warrant generally relies on the cooperation of service providers, acting typically in response to a direction from the Government under RIPA s12. A copy of the intercepted communication is passed by the companies to the intercepting agencies who examine it using their own staff and facilities. External communications may be obtained under a s8(4) warrant either directly by GCHQ, using its own capabilities, or through a service provider." (emphasis added)
Some possible approaches to trimming:
(1)   Limitations (perhaps territorial) on unilateral conduct under bulk warrants.
(2)  Special thresholds for the use of (say) bulk equipment interference warrants.
(3)  Limits on what a technical capability notice can require.

H.   Intermediate stages
Bulk interception and use of its product may take place in several stages, such as: collection, culling (discard of unwanted types of data), filtering (use of positive selectors), storage for subsequent querying by analysts.  Whether these techniques are typically applied to secondary data to the same extent as to content is unclear.
The Bill says nothing detailed about the culling and filtering stages, other than restrictions by reference to someone's location within the British Islands on selection of content for examination.
Some possible approaches to trimming:
(1)   Specific obligation to apply data minimisation techniques at intermediate stages, applicable to both content and metadata
(2)  Specific provisions controlling culling and selector types (for instance requiring individual warrants for "strong selectors")
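The staged pipeline described above, with a data-minimisation step of the kind contemplated in suggestion (1), can be sketched as follows. The record fields, the culled types and the selector are all invented for illustration:

```python
# Invented intercepted records for illustration only.
RECORDS = [
    {"kind": "voip",  "sender": "x@a.com", "content": "..."},
    {"kind": "email", "sender": "t@b.com", "content": "..."},
    {"kind": "email", "sender": "y@c.com", "content": "..."},
]

WANTED_KINDS = {"email"}        # culling: discard unwanted types of data
STRONG_SELECTORS = {"t@b.com"}  # filtering: positive selectors

def pipeline(records: list) -> list:
    """Collection -> culling -> filtering -> storage, with data
    minimisation applied before anything reaches the store."""
    store = []
    for rec in records:
        if rec["kind"] not in WANTED_KINDS:
            continue  # culled at the intermediate stage
        if rec["sender"] not in STRONG_SELECTORS:
            continue  # not matched by any positive selector
        # Minimise before storage: keep only what analysts may
        # later select for examination.
        store.append({"sender": rec["sender"], "kind": rec["kind"]})
    return store

print(pipeline(RECORDS))
```

The sketch makes the regulatory gap concrete: everything of substance happens at the culling and filtering stages, yet those are precisely the stages on which the Bill is silent.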
I.      Real time versus periodic
Is the bulk communications data acquisition power meant to be one that should be exercised occasionally when specific circumstances justify it, or can it be exercised routinely? If the latter, could it be used as a near-real time or quasi-real time feed?

A one-off data dump in exceptional circumstances is a rather different animal from a near real-time tool. In this context the recent IOCCO report speaks of ‘regular feeds’ acquired under s.94 Telecommunications Act 1984. The Bill appears to cover both possibilities.
Some possible approaches to trimming:
(1)   Specially justified occasions versus frequent routine feeds.

J.     Interaction with communications data retention
The bulk communications data acquisition power is closely linked to the communications data retention power.  The more broadly the data retention power is exercised, the greater the range of datatypes that will be available to be acquired in bulk.
It is significant in this context to recall that the data retention power (a) goes far wider than the internet connection records that the government has so far discussed and budgeted for in its Impact Assessment; and (b) unlike DRIPA, can be used to require relevant communications data to be generated or obtained, not merely retained. 
Some possible approaches to trimming:
(1)   Limit bulk acquisition power to concretely specified types of communications data; and/or
(2)  Require specified public consultation and procedures if any extension of compelled retention or acquisition is contemplated.
K.    Types of operator
The Bill significantly extends the classes of operator to which the various powers can be applied.  The table below compares the powers in current legislation (mainly RIPA, but bearing in mind the extension effected by DRIPA) with those in the Bill.

Compliance and assistance obligations expressly applicable to private operators are highlighted in green.  "Telecommunications operator" under the Bill definition at Clause 233(10) includes private networks (and 'service' is not restricted to a commercial service).  
