This talk is the fifth in a series about "The impact of the GDPR in the Higher Education sector". You can view the other talks by following this link.
The General Data Protection Regulation (GDPR) is a European Union (EU) regulation intended to strengthen and unify data protection for all individuals within the EU. The enforcement date is 25th May 2018, at which time organisations in non-compliance will face heavy fines.
What will that mean for Higher Education Institutes (HEI) and other public institutions that need to deal with students' and prospects' personal data on a daily basis?
These and other questions were the focus of an event organised by FULL FABRIC, at which a series of subject-matter experts explained the impacts and the opportunities created by this regulation.
This is the fifth video in a series of six, with Gerrit-Jan Zwenne, Professor of Law & Digital Technologies at Leiden University. He discusses the specific rights individuals have under the General Data Protection Regulation.
This video was filmed at FULL FABRIC's conference The Impact of the GDPR in Higher Education. The event was held at Imperial College London on 22 June 2017 as part of London EdTech Week.
Video Transcript
00:02 Gerrit-Jan Zwenne: [My talk is a] view of individuals' rights, data subject rights, and for me, the perspective is also that of a lawyer. And obviously, most lawyers working in data protection law work for companies or governments, for organisations, for the data controllers. So that would be... It's probably my perspective, and I might be a bit biased in that respect. You see here, my photo. Why did I put that up there? Well, maybe you'll remember me later, invite me for another event, but it also has some meaning in terms of data protection, because this is ethnic data, racial data. You can see the colour of my skin. And with respect to such data, we call those special data, special categories of data, and very strict rules apply. So I notice that the individual's rights have already been touched upon in the previous presentations. No problem, because probably I'll give a different perspective, but I made this slide with five types of rights, and I have about 25 minutes to discuss these with you.
01:19 GZ: We'll start with data subject access and rectification rights, and the right to object. These rights are not so new, not at all. They are more detailed. The GDPR provides... Asks for much more detail. Right now, if you get a data subject access request and no exemption applies, and it is about personal data, a lot of conditions there, it must be about personal data, that's the first thing. But if that's all the case, then you have to provide your own identity. Well, that's an easy one. The purposes for which you have collected the data, for which you processed the data; that should not be too difficult, but practice shows that it often is.
02:09 GZ: Not every organisation, not every data controller has a clear picture of that. This is going to be a real problem in terms of accountability. And all other information has to be provided that is necessary to safeguard the rights of that individual. But you have to realise with the GDPR, as in current data protection law, there are a lot of open concepts and vague norms that have to be filled in, made concrete in practice. So there's a lot of uncertainty. It's very helpful that data protection authorities provide guidelines, but it's always my advice to clients, and when I talk to students, that ultimately it's for the courts to decide. The Data Protection Authority has a very authoritative say in this, but they do not have the final word. Not at all. Of course, I say this because the other party for me is always the Data Protection Authority.
03:16 GZ: So what do you have to provide for the purposes of processing? That's what you already got. Data categories of the recipients, retention periods, that's new. So how long will you keep the data, you have to tell that. You have to put in the address of the Data Protection Authority, okay? That's an easy one. Data sources, how to complain, if you do automated decision-making, if profiling is part of the processing, you have to inform the data subjects about that. And if there are third country transfers, so transfers to countries outside the European Union, actually the European Economic Area. The UK will be a third country for us. I'm not sure how we're going to deal with that, 'cause you will comply with the GDPR, I understand, but you are a third country. Well, probably some kind of adequacy decision will be made there. We have to wait and see, but it has to be mentioned. So if you're using a provider in the United States, or maybe use OneDrive for Microsoft, or Workday or Salesforce, that's something you need to tell the data subject when he submits the data subject access request.
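The disclosure items Zwenne lists here track the information a controller must supply under Article 15 GDPR. As a minimal editorial sketch (the field names and the example response are hypothetical, not from the talk), a controller could check an access-request response for completeness like this:

```python
# Hypothetical checklist of the disclosure items mentioned in the talk.
REQUIRED_DISCLOSURES = [
    "controller_identity",
    "processing_purposes",
    "data_categories",
    "recipients",
    "retention_period",
    "supervisory_authority_contact",
    "data_sources",
    "complaint_procedure",
    "automated_decision_making",
    "third_country_transfers",
]

def missing_disclosures(response: dict) -> list:
    """Return the required items an access-request response does not yet cover."""
    return [item for item in REQUIRED_DISCLOSURES if not response.get(item)]

# Example: a draft response that omits retention and transfer information.
draft = {
    "controller_identity": "Example University",
    "processing_purposes": ["admissions", "alumni relations"],
    "data_categories": ["contact details", "grades"],
    "recipients": ["cloud hosting provider"],
    "supervisory_authority_contact": "national DPA, address and complaint form",
    "data_sources": ["application form"],
    "complaint_procedure": "complain to the DPA or the courts",
    "automated_decision_making": "none",
}
print(missing_disclosures(draft))  # -> ['retention_period', 'third_country_transfers']
```

A real response would of course carry the actual data and legal review; the sketch only illustrates that each listed item needs an answer before the response goes out.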
04:40 GZ: At the moment, I don't know how it is here, but in the Netherlands, a data controller can charge a very small fee to comply with such an access request, 23 Euro cents per page. So most data controllers don't ask for that money, but sometimes, I have seen these cases in practice. There is a television show and there is something outrageous going on with a business information provider, or whatever, and then the television show puts a letter on the website with a data subject access request. So that company can suddenly get thousands of these requests. To be prepared, it might be helpful to say, "Well, could you first pay the 23 cents and then we'll provide the access." In the future, under the GDPR, this is not allowed any more. You cannot ask for money, only if it's a repeated request, but for a one-time request, no money can be asked.
05:45 GZ: In my country, in the Netherlands, we had a discussion on what has to be provided. And our Supreme Court at one point said, "You can comply with the data subject access request by providing a copy of the original." Of course, this is probably about paper files, a copy of the original. So then you give all the context. For civil servants, this became a problem because they didn't want to provide all that information. So there was some litigation and it ended up in Luxembourg, the Court of Justice. And The Court of Justice said, "Well, the law says it needs to be a full overview, a complete overview. So you do not provide a copy of the file, but just a one page in which we have your name," and that's this, "We have your Social Security number," and that's this, "We have your date of birth, and that's it." In practice, this is really, really important, the difference between a copy and an overview. So what does the GDPR say? A copy of the personal data processed. I don't know what it means, but I think you still can... It's sufficient to provide a complete overview. So a one page with this, the data we got from you.
07:15 GZ: I once got a request from a client. He sent it through, it was an access request from a data subject. He said, "This is my IP number. I notice that you collect IP numbers. Could you give me all the information relating to that IP number?" So the dates [07:33] ____ or something. Is an IP address personal data? Because that's the first thing you have to answer, the first question to answer is, is it personal data? Because data subject access requests are about personal data. So is an IP address personal data? In the Netherlands, the Data Protection Authority says, "Yes, it is always, no problem." I don't think that's right. At the moment, I'm on 4G. So I get my IP address from EE, I don't know that provider. But maybe EE knows T-Mobile, which is my provider, and maybe they can connect, relate that IP address to me. However, if I were on Wi-Fi, public Wi-Fi here, there's no way somebody can identify me with that IP address.
08:35 GZ: So my view would be that is not personal data. However, if it's done by Vodafone or T-Mobile, then it's probably personal data. I know a lot of DPAs do not agree to this. There is some case law about this from the Court of Justice. So, I got this question from a person that says, "This is my IP address, give me my data." So what did we answer? Well, we answered, "We're not sure this is personal data. However, the first thing we need to get from you is some verification that you are the user of that IP address. Because if it is personal data, we cannot give you that unless we are absolutely certain that you are the data subject," and of course, then the discussion ended.
[laughter]
09:23 GZ: Okay, let's move on. Right to obtain restriction of processing. This is a new one, and basically it says when somebody, a student, makes a complaint, tells the university, "I do not agree that you process my data. You got the consent in the wrong manner, it's not valid, whatever. We contest that you process the data." And then this right says that the processing should be suspended. In my firm, we talked a whole hour about this right to restriction, and what it could mean in practice. And we actually wrote a letter to the ministry that was drafting the Act implementing the parts that member states can implement themselves, asking them, "Well, could you do something about this? Because this is going to be horrible. Somebody complains, and you have to stop processing to find out what's going on, and then you can continue processing or not." I really have no clue how to deal with this if there's not some other regulation, a derogation to this right. The GDPR allows for a derogation here, and member states can do that. So I really hope that your legislator is wise enough to do something about this.
11:01 GZ: Right to be forgotten. We already have a right to be forgotten. It's the famous Costeja court decision. Mr. Costeja went bankrupt in 1998 or something. And he had two apartments, houses that had to be sold in a public auction, and these were announced in a newspaper. And Spanish law required that a newspaper had to publish this, and also the newspaper needed to have this in the archives. So Mr. Costeja, he was a lawyer, he did not like this at all. It cost him business, because people started Googling him and they found, "Well, he's not so reliable, he went bankrupt in 1998." So the first thing he did, he asked the newspaper, "Could you remove this?" And the newspaper said, "No way, because we need to have this because of the law. The law requires that we keep that data in our archive." And then he did something clever.
12:03 GZ: He asked Google to remove this from the search results. So if you search for his name, you will not find the link to the newspaper with the announcement that his properties were sold, auctioned. But that's the right to be forgotten. We've already got that. In my firm, not me, but other partners, other lawyers are working on the right to be forgotten cases in the Netherlands, and we represent Google, of course. We already got that. Still, in the GDPR, there is a provision, an article, on the right to erasure, and this is not something I edit, these... Is it semicolons? But this is something that's actually in the text of the GDPR, the right to be forgotten. Right. And it basically says that if you, as a data controller, get a request to remove some data because you don't need them anymore and you decide, "Yes, this data subject is right, I will remove the data." If you do that, then you also have to inform the third parties that got the data from you that there was a request to remove the data. And then the third party has to think, "Well, should I remove it as well?" That's what it says.
13:32 GZ: So it's new, but not... It doesn't have a real big impact, I'd say. Automated decision-making and profiling. On the Dutch Data Protection Authority's website, it says you cannot profile unless with consent. That's not right. The law says if you do profiling and automated decision-making, and it has legal consequences or significant effects for the data subject, then the data subject has the right to object, to stop that profiling, unless, and there are three exemptions: unless the law requires the profiling. So in tax law... In the Dutch social domain, in tax law, there are some provisions that allow for profiling to make a risk-based supervision regime. Or unless it's necessary to execute a contract with the data subject. Could anyone think of a contract that requires profiling?
14:49 Speaker 2: Mortgage? Mortgage?
14:53 GZ: Mortgage? Don't think so. No, because you can give a mortgage without profiling. It's, of course, nice to do it, to make a risk analysis. I'd say Netflix. Netflix, they do not only allow you to watch TV series, House of Cards, etcetera, but they also make recommendations, and that's part of the contract. They profile you, and that's the contract. The service you buy from Netflix is also the recommendations. Maybe a dating site, you want to match certain profiles? Well, the profiling is part of the contract.
15:33 Speaker 3: Application? Like an application to a university?
15:43 GZ: I don't think you enter into a contract, and to execute that contract, you would need to profile. I don't think so.
15:51 S3: When you submit the application, you...
15:56 GZ: I think there can be profiling, but I don't think that's the execution of the contract with the data subject. The contract with the data subject...
16:01 S3: The university's comparing the different applicants, so that's profiling.
16:09 GZ: I'm not convinced.
[chuckle]
16:10 GZ: Talk with the DPA, I go.
[chuckle]
16:17 GZ: But, of course, we first would start looking at the consequences, the implications. Does it have a legal effect? Or, and this is more relevant, similarly, significantly affects him or her, the data subject. Well, if it's about advertisement, adverts are now happening with cookies and the microsecond auctions of profiles. If it is personal data, that's the first question to be answered. But if it's personal data, then the profiling results in that you get a certain advertisement, a banner. Does that significantly affect you? I'd say no. It may be annoying, but that's it. And if you're really annoyed, get an ad blocker, or use, what's it called, incognito or in-private browsing, and you can minimise the risk. However, if you are profiled, you get an invitation for a job interview, or a mortgage, that significantly affects you. So the first question to answer is, how serious is it? And then you start looking at the exemptions. But for a university, if it's about I can enter into this university or not, I think that's a very significant effect.
17:50 GZ: So, moving on. Data portability. Data portability is quite interesting. It only applies in two situations: if the processing ground, the basis for processing, is the data subject's consent or the execution of a contract. So if it's done for a legitimate interest, not overridden by the interests of the data subject, or a legal obligation, or a public task, then it doesn't apply. Makes sense, of course, because it would be something that you ask the tax authorities, "Well, can I get my data, and remove it from the... " That doesn't work. It only applies to the data provided by the data subject. And we'll go into that later on. What does this right say? You can get the data in electronic format, get it for yourself, or even have it transferred from one controller to the other controller.
19:01 GZ: When the lawmaker made this right, it was to give the users of Facebook the possibility to remove all their data from Facebook to another social network. I don't know what other social network. Maybe Google+, but who uses Google+? I asked my students and they don't even know. "Google+?" So it wanted to give the data subjects control. If you do not agree with Zuckerberg's privacy policies, you at least have the right to get it all and put it in another social network. What I see in practice is something completely different. It's not bad, it's not wrong, but what I see is that, for instance, an electricity company, could put a button on its website, and say, "If you want to have a tailor-made offer from us, click here, give us your consent, and then we'll get your data from your current electricity company. We get the data, and we'll give you a nice tailor-made, personalised offer." Insurance companies do the same thing. You have a car insurance by that company, click here, we'll get your data from your current car insurance company, and we'll present you with a nice personalised offer. I don't think anybody expected that this would happen, but it's happening right now. At least, that's what I see.
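Article 20 GDPR requires the data to be handed over in a structured, commonly used and machine-readable format. As an illustrative editorial sketch only (the records and their legal-basis labels are invented, not from the talk), a controller might filter what the portability right covers before exporting it as JSON:

```python
import json

# Hypothetical records: each notes its legal basis and whether the data
# subject provided the data directly (the narrow reading discussed in the talk).
records = [
    {"field": "email", "value": "student@example.org",
     "basis": "consent", "provided_by_subject": True},
    {"field": "grades", "value": [8, 7, 9],
     "basis": "legal_obligation", "provided_by_subject": False},
    {"field": "photo", "value": "photo.jpg",
     "basis": "contract", "provided_by_subject": True},
]

def portability_export(records):
    """Export only the records the portability right covers: the basis is
    consent or contract, and the data subject provided the data."""
    portable = [r for r in records
                if r["basis"] in ("consent", "contract") and r["provided_by_subject"]]
    # A structured, commonly used, machine-readable format -- JSON here.
    return json.dumps({r["field"]: r["value"] for r in portable}, indent=2)

print(portability_export(records))
```

Under the broader Article 29 Working Party reading discussed next, observed data such as smart-meter readings would pass the filter too; the sketch shows only the narrow "provided by the data subject" view.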
20:47 GZ: One word about the data provided to the controller. When the lawmaker, when the EU lawmaker, made this data portability right, they were thinking about the data you give them. So there's an online form and you provide them with your data, with your details. And maybe you upload a photo, and then you have a right to get those data transferred to another social network, or get it yourself. The Article 29 Working Party, which is the body where all the DPAs in Europe talk with each other and present views and opinions on the interpretation of the concepts of data protection law, has a slightly different view. They say it's also about data that is generated on the basis of activities of the data subject. So that's more. In the electricity company example, it would also be the data in the smart meter. If it's about web surfing behaviour or a wearable, it's also that type of data. So for a wearable, it could be your heart rate, whatever.
22:04 GZ: This is very controversial, and the European Commission, I haven't seen them doing that often, but they even sent a letter to the Article 29 Working Party. "We do not agree with this interpretation of the concept of provided data." Who do you think are the controllers that object the most to this? The telecoms companies, because, of course, they have a lot of data generated on the basis of the activities of data subjects. All the location data. So we have a lot of uncertainty right now. Article 29 says this, European Commission says that. This is how I present these platforms. Platforms that are developing right now, where you can get a button to get data from the competitor.
23:06 GZ: I'm gonna finish. And this, of course, is a terrible slide, but you can see the red letter. This is actually quite difficult. If you provide information to the data subject, it needs to be concise, easily accessible, easy to understand, in clear and plain language. Very obvious, but still it is difficult, because you have to provide a lot of information, and you have a very small screen. So what would be the best approach to that? Layered. Best practice is layered. You give it one sentence. If the data subject wants to know more, click, gets maybe two paragraphs. If he really wants to know it, click another time, you get the full privacy statement.
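The layered approach Zwenne recommends can be modelled very simply; the notice texts below are invented placeholders, not real privacy wording:

```python
# A hypothetical layered privacy notice: one sentence first, more on each click.
LAYERS = [
    "We use your details to handle your application.",            # layer 1: one sentence
    "We keep your application data for two years and share it "
    "only with the exam board. Full details in the next layer.",  # layer 2: short summary
    "FULL PRIVACY STATEMENT: ...",                                # layer 3: complete text
]

def notice(clicks: int) -> str:
    """Return the notice text shown after a given number of clicks."""
    return LAYERS[min(clicks, len(LAYERS) - 1)]

print(notice(0))  # the concise first layer
```

The design point is that the concise layer is what fits the small screen, while the full statement remains one or two clicks away for anyone who wants it.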
24:03 GZ: I don't think Apple complies. I installed a new iOS last weekend; it took me a whole weekend to read it. It's longer than a Shakespeare play. [laughter] The privacy statement for it, I mean. And Apple is no exception. Okay, that's it for me, thank you.
[applause]
24:28 S2: Thank you, Gerrit. I wanted to go right back to the beginning of your presentation, where you had the slide with your photo, and you were talking about sensitive data. So we all live in an era of social media, I've been sat here tweeting away, and many academics are now on social media, helping to communicate their research, and many universities will be on social media. So what's the legality around sharing that image out to the world? 'Cause, obviously, there are racial aspects that we talked about.
25:01 GZ: If I put it on LinkedIn or other social media, I don't give permission, I do not consent to every conceivable use of the photo. Of course, I take the risk, and it happens, but I don't consent to that. That's perfectly clear. And in this case, we actually need explicit consent, so the bar is even higher, because it's ethnic data, it's racial data, you can see the colour of my skin. The same rules apply to health data, data on religion, political beliefs, criminal data. But if I publish it on LinkedIn, of course, I cannot prevent people from looking at it. Actually, I do it for that purpose. And because I make it evident to the world, I publish it, I cannot use these specific restrictions for ethnic data, racial data, but still you need to have a processing ground and there's purpose specification, and purpose limitation. So clients sometimes ask me, "Well, we collected all these data from LinkedIn and we have a nice big database." The first thing I ask is, "Did you inform these recruits or possible recruits?" "No, we didn't." "Well, I think you should, because publishing it on LinkedIn does not imply that you consent to that."
26:28 S2: So if we're at an event like this, and someone tweets a photo of you, which also contains that data, is that slightly different, or...
26:38 GZ: Not slightly different, it's more or less the same. What you could look for, but maybe only if it's a closed group, so usually Twitter is to the whole world, but if it's a closed group, then you could say, "Well, this is, it falls in the personal, or household exception." Then it needs to be closed. But could be closed to 10,000 people.
27:01 S2: Okay. So let's see some of the other questions that have come in. First up, do retention periods apply to students' data such as grades?
27:10 GZ: Yes, but I guess there will be quite long retention periods, might even be in the law. I was thinking about what could be a public task of a university? We do job interviews with law students, sometimes we ask the student, "Could you give up some names so we can call?" Sometimes you also ask, "Well, we're gonna check this with the university, if you don't mind." And, of course, they don't mind. But, I think... I'm not sure how the law is here, but I think the university has a legitimate interest to keep these grades for quite a long time because of this, and fraud with diplomas is a serious issue, and the only way to resolve that is by keeping the data. But, again, you need to inform the students about that. If you haven't informed, it cannot be right.
28:03 S2: Okay, similar questions I want to just ask, how does the right to be forgotten translate into social media? Can you still prospect on LinkedIn, Twitter and Facebook?
[pause]
28:24 GZ: Well, the right to be forgotten is the right that you ask a search engine, that could also be Facebook, "I do not think... " You claim, "The data you have on me is irrelevant. Please remove it." And if it is indeed irrelevant, and there's no other... It changes when you are a public figure. If you are a politician, then everything changes. Because for a politician, we have different rules, and we need to know what they did in their past. But for people that do not seek the...
29:01 S2: The limelight.
29:01 GZ: Public attention, the limelight, if it is irrelevant, it needs to be removed. I'm not sure this answers the question, but maybe I do not understand the question.
29:11 S2: Okay, let's go to the final question on here, can profiling be used to make a non-automatic decision?
29:19 GZ: What the law says is that these rules apply so you can object to automated decision-making, including profiling. This was also something we talked about for an hour with the associates. What does it mean? Can profiling be used? Probably it can. What is profiling? The GDPR gives a definition of that: profiling is evaluating certain aspects of an individual to make decisions on that individual. So it could be your past behaviour, it could be your location, it could be anything. So of course, you can use profiling to make a decision manually, not automated. But the law applies when it's automated.
30:06 S2: Okay. And I suppose it's more complicated if we're doing it manually, anyway. More time-consuming, etcetera.
30:12 GZ: Yes, but it has happened for thousands of years. Two buses collide, and the doctor steps in. And then he walks through the bus. And the people that are screaming, he will not pay attention to. But the people that are really silent and bleeding, he makes an assessment. And he evaluates, "Well, okay, you are not good, but you are not dying. You broke your leg. Okay. You can suffer for another half hour. But this person needs medical attention right now." At that moment, he is profiling. We are profiling all the time. That's not the issue. The issue is Kafka. The issue is profiling, making decisions on the basis of a system. Profiling is "computer says no". You know the sketch, Little Britain. That's what this article is about.
31:01 S2: Yeah. Okay. We've probably got time for one more, and then we'd better move on, 'cause we're a little bit over time. The gentleman at the back?
31:08 Speaker 4: Just on the profiling one again. So if, say, for employment purposes, to pass selection, a company was using psychometric tests where the output essentially is automated, do they have a legal... Because that's pre-contract law, and it potentially...
[overlapping conversation]
31:26 GZ: The same applies. Pre-contract, the same applies. However, in practice, what the lawyers will tell this company, you do not make a decision. You give an advice. And that makes all the difference, because then the decision is not made by the computer. It's not, computer says no. The decision is made by an individual. It's a bit cosmetic. I realise that. But this is how the lawyers resolve this.
31:57 S2: Okay. Well, let's give Gerrit a big round of applause, everybody.