This is the fourth talk in a series on "The impact of the GDPR in the Higher Education sector".
The General Data Protection Regulation (GDPR) is a European Union (EU) regulation intended to strengthen and unify data protection for all individuals within the EU. The enforcement date is 25 May 2018, at which point organisations that are not in compliance will face heavy fines.
What will that mean for Higher Education Institutions (HEIs) and other public institutions that need to deal with students' and prospects' personal data on a daily basis?
These and other questions were the focus of an event organised by FULL FABRIC, at which a series of subject matter experts explained the impacts and the opportunities created by this regulation.
This is the fourth video in a series of six. It features David Erdos, a lecturer in Law and the Open Society at the Faculty of Law, University of Cambridge, whose research focuses on data protection and the interface between data protection and freedom of expression.
This video was filmed at FULL FABRIC's conference The Impact of the GDPR in Higher Education. The event was held at Imperial College London on 22 June 2017 as part of London EdTech Week.
Video Transcript
00:02 David Erdos: Okay. Well, thank you very much for lasting this long on this humid and hot day. I'm David Erdos, I'm a lecturer in Law and the Open Society at the Faculty of Law in the University of Cambridge, and I suppose what I mainly research on is actually data protection, and particularly the interface between data protection and freedom of expression. And I would actually include a lot of research within freedom of expression, because it's very much about the free flow of information, the generation of public knowledge and public understanding. It's actually about fairly important and socially interesting matters. And I suppose this will be a slight shift of gear compared to the earlier sessions, possibly in a number of senses. I am an academic, and I am an active researcher, so apologies in advance: there will be quite a lot of law. But there'll also be quite a lot of specialist law, because I think probably what you've learnt today has been a lot about the default data protection provisions, whereas what I want to emphasise to you is that there's an awful lot of specialised law to look at in getting research regulation and governance right within your institutions.
01:29 DE: And further, to make the issue more complicated, the GDPR makes many changes here. In most institutions, possibly, data protection hasn't been given the prominence it needs, and research has been sort of bolted on. It's been bolted on often by looking just at the default rules. And if any specialist rules have been looked at, it's the ones which are badged as research; you think of section 33, for example, of the Data Protection Act. Well, going forward, at least in social and humanities research, that's no longer going to be the primary clause to look at. Academic expression is protected as an instance of freedom of expression. And that is designed to be the primary way in which you reconcile these two values in social and humanities research. In biomedical and especially safeguarded research, yes, you primarily look at the research provisions, but those provisions have also been designed with purely economic processing in mind: the websites, the huge big data companies online which, for their purely economic interests, try to manipulate data to generate profit. There are also general derogations for public interest purposes and where a right and freedom is involved, which researchers in a publicly interested academic setting can, I would argue, use, but they're not available for purely economic processing.
03:08 DE: So, in fact, we need to look at three different special provisions. That's the introduction. I thought I'd just very quickly start with what in fact is the default in data protection. I'm going to go over this incredibly quickly because I'm sure you've already covered it a lot today, but it's almost impossible to over-emphasise the breadth of these rules and these provisions. Personal information, according to the Information Commissioner's Office, is anything about an identified or even identifiable individual. Their guide on journalism and data protection even says that the job title of a public official, an elected public official perhaps, in the public domain, is personal data. Logically, that would mean that mentioning Theresa May (is she the Prime Minister today? I believe she still is) is her personal data. It's really quite esoteric in a way, because it's not, I think, going back to Norris' point, necessarily how institutions themselves think of personal data. They think of it as confidential data, which is potentially a small part of what researchers handle in their projects. And processing, of course, is anything you do with data: even bringing it up and looking at it, even storing it, even deleting it, actually, is processing, and it's regulated at least if it's on a digital device, and that's pretty much everything.
04:45 DE: What the GDPR does, officially, is give very broad duties to anyone who is processing personal data. We've got a beefed-up notion of the data protection principles, we've got a need for legitimating grounds, and I'm very pleased to hear that the ICO is now saying legitimate interests should be available for universities. I think that's right, because on the special-powers test, as in Foster v British Gas and the European Court of Justice's understanding of what a public authority is, a university wouldn't be a public authority in most cases. So I'm very pleased about those developments. And then you have rules on sensitive data, which are extremely onerous. They essentially say that unless you have explicit, freely given, rescindable consent, or the data subject has manifestly put it into the public domain themselves, you can't process it; that's the default.
05:42 DE: Transparency and control: very extensive provisions about making data transparent to the data subject, and a whole panoply of rights for the data subject to give them some control over that processing if they wish. And I hardly need to go through all the discipline and supervisory provisions, but suffice to say these are also made significantly more prescriptive in the GDPR than in the existing framework. That's really where the issues come in. How on earth do you fit this into the free flow, the messy, the ubiquitous, the decentralised free flow of information that is the pith and substance particularly of social and humanities research?
06:32 DE: If you look at the sensitive data provisions, they in many ways go through the categories which are the most socially interesting things: political opinion, religious belief, things to do with criminality and crime and deviance. This is what most social scientists and humanities people research. It's the socially interesting thing. It's also what most journalists write and publicise about. If you have to go and tell every individual you're researching what you're up to, where does that leave covert or deceptive research, research on police racism or research into far-right groups? It would make a mockery of any attempt to do this kind of research.
07:17 DE: And that's really where the protection for academic expression comes in, because literary expression, artistic expression and journalistic expression are already clearly protected in section 32 of the Data Protection Act. But it seems rather odd to follow the logic of saying, "Oh, no, no, you're a researcher, it's all to do with the research provisions," which at most say that a careful, ethical analysis of, say, a contemporary historical issue done by an academic can't be causing damage or distress when you're using sensitive data, or something like that.
07:54 DE: But of course, if you're a rag tabloid involved in infotainment about a non-public figure, yeah, you can use section 32. That makes no sense. In principle it makes no sense. And practically, trying to fit social and humanities research into this kind of straitjacket also makes little sense at a practical level. As for the new provision, I should declare not exactly an interest but an advocacy role: along with the ESRC, the Economic and Social Research Council, the Wellcome Trust, and a number of others, I very much advocated for this solution and was very pleased to see that, very much as a result of UK advocacy, this provision was introduced at a pan-European level. Academic expression is now included alongside artistic, literary and journalistic purposes as an instance of freedom of expression.
08:50 DE: Now, this doesn't mean that anything goes, that social and humanities researchers can do anything they like with data because it's their right to freedom of expression. It is meant to be a balance between the default rules and the extremely strong principle that people have a right to disseminate public knowledge, public understanding. And not just disseminate, but also go through all the preparatory steps which are necessary to that: collecting the data, analysing the data, and of course, receiving the data, because the right to freedom of expression includes not just imparting information but also the right to receive it. So, social and humanities research should be protected going forward. And, of course, I should say we don't know how the UK Government and the UK Parliament are going to implement the GDPR in terms of these derogations. We found out a little bit yesterday that it was likely to be primary legislation, but in terms of the substance, we're slightly in the dark. And I'm making an assumption here that it's going to look quite like section 32.
09:55 DE: If it doesn't, because of Leveson and things like that, it will still be in the same ballpark. Section 32 says you look to a reasonable belief that the publication is in the public interest. And that would be, I think, in an institution that believes in academic freedom, the reasonable belief of the researcher. The researcher has a reasonable belief that this is in the public interest, and they have a reasonable belief that it would actually frustrate, or be incompatible with, that expressive purpose to follow, say, the sensitive personal data ban, the transparency rules, etcetera, etcetera. That is the basic model of reconciling freedom of expression with data protection.
10:39 DE: Okay, but what about... I'm not trying to say that all research would realistically fit that model. Clearly, much research depends on specially safeguarding data beyond what we would see in a balancing exercise like that. The obvious example would be biomedical research, but there are other examples, including potentially in some records-based social science. So, yes, there you first [11:13] ____, as in section 33 currently of the Data Protection Act, look first not to article 85 on freedom of expression, but to article 89 on research, to what is available as derogations for research. Again, this depends on national implementation. We have to make some assumptions.
11:35 DE: Now, this is a traffic light version of the first slide which I showed you. What's in green is mandatory to provide derogations in this area. What's in amber (it hasn't come out quite as well) is optional, and actually there should also be a middle yellow, which is a very strong hint that this needs to be available, but not quite mandatory. And red, of course, means there's no derogation here. And I think the thing that struck me most when I did this graph is just how much is in red: almost all the data protection principles, almost all the transparency rules, all of the discipline provisions, all of the supervisory provisions. The exceptions are that the repurposing of data is in principle okay, you can hold it for longer than you otherwise would for a research purpose, and there's an optional opt-out from subject access. Control rights: there are some opt-outs. I'm not going to go through it, because it's probably too dense. But there are some opt-outs, although they're not complete.
12:45 DE: Sensitive data: for a very long time it looked like the European legislator would make a compulsory provision for processing sensitive data without consent for a research purpose, because of all the issues that has created around Europe, with some countries not having any vires at all and others being rather unclear. At the very last minute, really, it was added in as subject to Member State or Union law, or something like that, which again means that it's up to member states to make that law, and if they don't, it probably isn't available, or it might not be available.
13:21 DE: I don't think this is terribly important, because we can assume that the UK will make use of these provisions. They map quite strongly onto the existing section 33 and allied provisions, but the GDPR lays out what the requirements need to be, and these are not too problematic. I think the biggest problem is what is in red, what's not possible to derogate from. But they talk about appropriate safeguards, data minimisation in particular, needing the derogations to be necessary, and, for sensitive data, specific measures to safeguard fundamental rights. Now, again, we wait to see what the government will do about that. Currently, the derogations depend on non-particularity, that is, not making measures or decisions in relation to a particular data subject when you're processing under these derogations; non-malfeasance, not causing substantial damage or substantial distress to the data subject; and, in relation to subject access only, not publishing material which can identify an actual person.
14:30 DE: But I have to say, these derogation conditions haven't worked well. Take biomedical research. Much of biomedical research, I would say increasingly with genetic research and things like this, does inevitably produce results which may be of interest to the data subject. And the idea that you have to compartmentalise that and never go back to the data subject, even when it's to their benefit, because that would result in a measure or a decision being taken in relation to a person, is absurd, but it's caused an awful lot of angst and an awful lot of problems. So I've actually been advocating with a number of organisations that we drop some of these peremptory rules in relation to the derogations, because they haven't worked well. And some countries don't have them, like Germany. We'll wait to see what the government does.
15:20 DE: But the biggest problem, as I say, is how much is in red. And again, I think the reason the European legislator wouldn't go further is that the prime target isn't higher education or publicly interested research. It's the danger of these provisions being abused by huge big data companies online, mining people's personal information to make millions of pounds for their purely private economic interest. And that probably relates to why we don't see any derogation from the transparency provisions, because it's seen as: if someone's making a buck off your data, why on earth shouldn't you at least know about it? But this will cause potential problems for records-based research, because, going back to the definitions, processing is anything you do with data, and the direct collection of data triggers a pretty much indefeasible right for the data subject to be informed of a whole variety of things, including the purpose for which the data has been obtained. And if there is a repurposing of that data, it does appear to say explicitly that you must go back to the data subject to tell them about that, irrespective of whether the repurposing is compatible or incompatible.
16:47 DE: And we know that, in fact, with safeguards it is always going to be compatible. But that is a different thing from whether you have to tell them about it. And you can immediately see the problem, I hope: an organisation thinking of supplying records, even on safeguarded terms, to a research team may think, "Hang on a minute. That looks to me a bit like processing, the disclosure of data." Arguably, even the anonymisation of data is a processing operation performed on it, but certainly the disclosure of data would be. Why am I doing that? Am I doing it for the same purpose for which it was collected, which was probably routine business purposes? No, I'm not. I'm doing it for a publicly interested research purpose. And the individual may be uncontactable. Although the data is identifiable in the sense that the European legislator defines as identifiable, and we've been told that dynamic IP addresses are identifiable, it's, in a way, anyone's guess what that threshold is. But it's far, far, far lower than having the name, contact address and email to get back in touch with people. And it's very likely that many of these old records which become relevant, say, to a social science project will not be trackable in that sense, or if they are, it will be at great, disproportionate expense and with possible data security issues.
18:08 DE: Going off and trying to find the contact details of people and getting it wrong is more dangerous, I think, than actually saying it's going to be safeguarded, it's not going to be used for any other purpose, and it's in the public interest. And that's really where the possibility of using the general restrictions in article 23 of the Regulation comes in, because these are not available for a purely economic interest, but they are available for important objectives of general public interest. And I would have thought that biomedical research is the pre-eminent example of that: saving lives, improving welfare. I don't think you can get more of an important objective of general public interest than that. These general restrictions are also available when a right and freedom is in play. And article 13 of the EU Charter, while we have it, talks of the freedom of the sciences and the arts, and of academic freedom. And I think scientific knowledge, and being involved in that endeavour, is the exercise of a fundamental right in those circumstances.
19:23 DE: What I think you can see here, and I won't go through the whole thing, is that all of the transparency and control provisions, including all those transparency rules I talked about, can be derogated from by member states on the basis of general public interest, or a right and freedom. And again, I've advocated that this is used, particularly in relation to records-based research, to provide legal certainty, because records-based research will continue, but I can see legal uncertainty and legal confusion coming unless specific safeguards are put in place. It will require a legislative measure for the restriction and some specific provisions as to the purposes, which would be a research purpose, or an archiving purpose, or whatever else; the categories of data; the scope of the restrictions; the safeguards to prevent abuse: this is what the article says.
20:17 DE: Now, there are different opinions about how to implement that. Some would say that the statute itself needs to provide all of these specificities. That's quite a tall order, although secondary legislation can be quite labyrinthine. The Netherlands has gone for a very interesting solution; I don't think it's quite come into law yet, it's a draft law, but it's basically self-certification. A bit like how the data export regime works in the UK at the moment, which is self-assessment, they're proposing that these general restrictions should be usable by any data controller when they can show a general public interest or a right and freedom, and they would then self-document all these protections. In many ways, for a proper reconciling of rights and the general interest, that has an appeal. Again, we wait to see what the government does, but I hope that they do something in this area.
21:23 DE: Okay, I hope there'll be time for a few questions. Just a few concluding highlights. I hope I've shown that under the new law, unlike a lot of what you've probably heard today, social and humanities research is being liberalised in terms of its regulatory framework, in the interests of freedom of expression. And for other scholarly research, including biomedical research, the EU legislator has seen this as an exception, as a special case, and has granted some very significant potential derogations. And I think this is, in some sense, a flipside to what Norris said: proper management of data is always very important, but some of the internal policies of higher education institutions for researchers are, I think, pretty confusing and, read literally, extremely restrictive, and they don't really think through that there's actually another right and freedom here, and a general public interest to consider, for a charity which is dedicated to advancing knowledge and understanding.
22:28 DE: And higher education institutions do have, particularly the pre-1992 ones under the Education Reform Act, statutory obligations to uphold academic freedom. And read alongside that, I think this framework does put an obligation on universities to relook at their internal data protection policies to make sure that they are finally properly balanced with the rights of researchers and the general interest of society in the generation of knowledge. And I think that involves, by the way, ethical review just as much. I'm here in a personal capacity, but I should say I'm a member of the University of Cambridge Research Ethics Committee.
23:12 DE: I think in terms of how these things are set up in universities, practically, in terms of research regulation, you need to be in touch with your research ethics committees and make sure that they're aware of these changes, these liberalising provisions, and the need to consider freedom of expression, because that's where most regulation of research takes place. It also obviously relates to your information governance policies. But I think there does need to be a recognition, particularly in social and humanities research, of the reasonable judgment of the academic researchers themselves being given due weight. And even in other areas, proportionality is sort of the lodestar here, in order to balance some things which are in quite considerable conflict. So, thank you very much, I'm very happy to answer any questions.
24:07 Speaker 2: Okay. Thanks, David.
[applause]
24:09 S2: Thank you, David. In fact, the question that I had was similar to this one. Your work has previously looked at how this might impact the arts or journalism. Is there any view on whether the GDPR has more impact on the sciences or the humanities, or is it a necessary headache for everyone? And I think, from your conclusion slide there, you said that the humanities have a more privileged position, but to put that in a ranking order, which field has the least privileged position, if you were to identify the other end of the spectrum?
24:46 DE: Well, again, it's strange that we are now, I think, less... Well, we are definitely less than a year away from this new law coming in. But I have to say, we just don't know, because we haven't seen the government's proposals. Member states, even under a regulation drawn up like this, have a large amount of discretion, and we wait to see what they do. But I think there's no question that biomedical research will not be considered an exercise of special expression, as journalism is. It will be processing sensitive, private, confidential personal data. Showing that it's necessary not to follow the discipline provisions will be very, very hard. So you can expect that sector to be impacted by all sorts of requirements that we've gone through today: impact assessments, data protection by design, data security standards, data processor agreements. Even within that controlled framework, I hope that valuable research is able to continue. But I've said quite openly that we have a potential problem there around the disclosure of records.
26:02 DE: It's easier to get round the receipt of records, because that's indirect collection; you can rely on disproportionate effort, and you need to have a privacy policy published on your site, essentially, or available to the public. But the organisation still has to give you the data. And with patients, for example, there's a lot of work going on in the NHS to design systems so that notification takes place. And I'm not saying that's a bad idea either, but there will be organisations which haven't gone through that, which have data that is very valuable to society, and which cannot reasonably contact, or even securely contact, those people. But I would argue that the research is very, very valuable, and so there could be a real legal headache there.
26:53 S2: Okay, so that leads us on to the next question: what is data ethics versus data privacy? And supposing there are people in the room who do get pushback from a researcher who's unhappy with a decision, how can they escalate that challenge, or how can they manage that situation?
27:14 DE: You mean, how can the researcher manage, or...
27:16 S2: Or if they're handing it over to, for example, the DPO, a person responsible for the data, and they're negotiating that between one another, how is it...
27:26 DE: Well, obviously, if an organisation doesn't want to give data, it's not as if the legislation requires them to give data. We do have, obviously, laws like Freedom of Information which do require data to be delivered, and we have the re-use of certain public sector information. But on the whole, if an organisation doesn't want to, for whatever reason, it has a right to refuse. The first thing I'd say on that question is that data protection isn't just about protecting privacy. You all know this, I'm sure, but it's about protecting fairness as a whole, the reputation of the individual, the concern that individuals are subject to all sorts of forces over which they have very little control and which have a big impact on their lives, and the need for a certain element of control. It's about all of that. And personally, I do think that the data protection framework should provide the main lodestar for what is legitimate and illegitimate research.
28:32 DE: Gosh, sitting on the University Research Ethics Committee, I should have more to say on where ethics come in. My own personal opinion, again, and not necessarily the opinion of Cambridge institutionally, is that reasonable ethical disagreement is absolutely essential to the human condition. People do not agree on ethical judgments. There's a very wide area of reasonable disagreement, and therefore, I think if an institution is going to regulate ethics, it can only regulate, and it should very formally only regulate, up to that reasonableness standard. Is this researcher acting as an irresponsible and unreasonable researcher? If the answer to that is no, then I think reasonable ethical disagreement is very important, because what do we mean by academic freedom if we don't mean the right to disagree and say, "Well, I have my academic freedom, and I am willing to defend that as ethical"? Once it crosses into being unreasonable and irresponsible, sure, the institution has a right to defend its reputation. But I think that is the test: unreasonableness.
29:43 S2: Okay, final question. Under the GDPR, you need to specify a timeframe for keeping data records; does that apply to research?
29:52 DE: Gosh, now, this is... I did bring along the actual copy of the legislation, because I feel that this is a very, very technical question. There are an awful lot of discipline provisions, as I say, and I can't say I've gone through absolutely every one in the back of my mind in ten seconds and there's nothing here, but on the whole, the answer is no. It's very, very clear that research data, subject to those appropriate safeguards, can be retained indefinitely, and therefore it shouldn't be necessary. Of course, it's very good if all people who are controlling data have a rough idea of what data is there and why. But no, it isn't necessary to have data retention periods when data can be retained indefinitely. The data should have a relevance to that research, or that research purpose, or archiving purpose, but that, in a way, is a different issue; that's a data quality issue. Should you need an actual retention period? No, I wouldn't say so.
30:57 S2: Okay, are there any final questions from the audience? No? Okay, excellent. Well, thank you very much, David. Let's give him another round of applause.
[applause]
31:10 S2: So that brings us to the end of today's event. I just want to say a huge thank you to all the speakers, who prepared and put all the energy into their presentations today, to everyone for coming along and asking so many fantastic questions, and, of course, to Full Fabric for organising everything and putting everything on today. If you're interested in accessing the presentations, those will be available. So there should be an option on [31:34] ____ to download them directly. And I think, just in terms of GDPR, there's been lots of fantastic takeaways today. So do go and follow all of the speakers on Twitter, go and find them on LinkedIn, and hopefully that's given you some guidance to be getting on with. So I think that's pretty much everything. I think we're going to just join outside for some extra networking, for those who want to continue to build their support networks, and otherwise after that, do have a safe journey home, and thank you very much for coming.