Podcast episode 24: COVID-19 and our digital rights Pandora’s box

Testing is the domain of doctors, but tracing COVID-19 is falling on telecom and software companies as they race to develop systems for monitoring the quarantine. But at what cost? 

Internet rights advocate and CIGI fellow Michael Geist says we are at risk of opening a Pandora’s box in an effort to protect our health, but that there are solutions.

Below is a transcript of this conversation. Some parts have been condensed for clarity.

You can listen to this episode wherever you get your podcasts. 

Michael Hainsworth: The glowing rectangles almost all of us carry offer a powerful tool in fighting COVID-19. The scientific community says test and trace. Testing may be the doctor’s domain, but tracing using technology is what will help keep the virus contained. How do we track ourselves without violating our privacy rights? Michael Geist is a digital freedom fighter. He’s also a Centre for International Governance Innovation senior fellow and a Canadian law professor.

We began our conversation with a bit of Greek mythology. Sickness, death, and other unspoken evils are the results of Pandora opening a box. When she saw what she had unleashed on the world, she quickly closed it, but it was too late. I asked Geist if he feels like COVID-19 has opened a digital rights Pandora’s box.

Michael Geist: It’s certainly raised issues that we’ve been grappling with for a number of years now around surveillance, the role of telecom companies, the role of Internet companies. And it’s done so in a way where there is, of course, heightened concern both about those civil liberties and at the same time real concerns around public health. And so there is a willingness I think amongst politicians and certainly some in the public to entertain policy approaches that might have seemed a bit beyond the pale even just a couple of months ago.

Hainsworth: Is that a concern to you or do you think that we are taking a sober approach to this?

Geist: I think a bit of both, to be honest. I think, on the one hand, these are really unprecedented times, at least for most of us in our lifetimes. And so, given the hardship this is creating economically, from a health perspective, from a mental health perspective, I think it’s incumbent upon us to explore every possibility and every potential approach. And much of what’s being talked about, when it comes to using technology for tracing or trying to identify where people are going, is largely looking ahead to second waves and to when we get past this initial hurdle. I think a lot of people recognize that this may be with us for a very long time, and if there are solutions out there that may help, we ought to be thinking about them.

At the same time, if we think about the role that technology has played and the willingness of some to simply say, “Well, technology will solve some of these issues,” there is, I think, a legitimate concern that we may jump into potential solutions that are ineffective but do have the effect of increasing the level of surveillance in society. And perhaps once those policies are established and implemented, it will be very difficult to undo them.

Hainsworth: I’m reminded of Benjamin Franklin’s quote about liberty and security: that those who’d give up liberty for some security deserve neither. Does that quote apply here?

Geist: You’re not the first to raise that quote in these conversations. We see other people talking about whether this is zero-sum, and the notion that we can have both. I do think that there may well be some trade-offs here, though.

I think it’s difficult to, on the one hand, say that hundreds of millions of people in countries around the world have to stay at home. And once we try to find ways to get people out, if there are mechanisms that allow that to happen, and that mean more lives are saved, by adopting technologies and using different kinds of tools that may be effective in lessening some of those terrible impacts, we ought to be thinking about them.

The challenge is, how do you do that without making it permanent? How do you ensure that you’ve got appropriate oversight? And how do you ensure that the solutions that you’re proposing will actually be effective? I think if there’s been a source of criticism from many in the privacy and civil liberties community, it hasn’t been that we shouldn’t be exploring and working towards some of these potential approaches. I think there is a recognition that this is a time where we need to be thinking about some of the approaches that might have been dead stop no’s in the past. 

But one of the conditions moving forward on this has to be that it’s going to be effective. It’s actually going to achieve what you’re hoping it’s going to achieve. And at least with some of the proposals, I think there’s a wide sense that in fact, we will not have the kind of effectiveness that people are looking for.

Hainsworth: You’re suggesting that the concern here is that we’re going to leverage this technology – 80 percent of North Americans have a smartphone. Pretty much every adult on the planet who wants one has one. That we’re going to leverage this technology but not do it in a sufficiently protective way, and at the end of the day also not in an effective way for fighting COVID-19.

Geist: Yeah, I think that’s part of the concern. There are really two roads here that we could be talking about. One concern would be, what would one of these plans look like, and will we have the necessary precautions, transparency measures, oversight and time-limited approaches to try to ensure that this is not a new normal? So that, to the extent we adopt certain measures that we probably would not have in the recent past, they are here for this particular emergency situation and not beyond. And that’s a tough thing to do. History tells us that once it’s there, it’s pretty tough to undo, but nevertheless that’s going to at least be the goal.

Even beyond that, before we jump into some of these approaches, it is incumbent on us to demonstrate that collecting this kind of data and using these sorts of technologies is actually going to work. And I think we’ve already got a fair amount of evidence to suggest that at least some of the proposals out there are unlikely to be effective in the way that some of the proponents might otherwise hope.

Hainsworth: What’s the component of those proposals that leads you to believe that that proposal would be ineffectual? Is there a common nugget or thread within them?

Geist: That’s a good question. There are, in some ways, two sets of proposals, at least with respect to some of the location-tracking proposals that we see coming out to deal with this issue. Some have focused on cell-site location and GPS: the idea that we could identify where people are going. This could be very useful, potentially, in a number of different ways.

Who did you run into? Are you close to someone who may have been exposed to the virus or may have the virus? Those kinds of things. 

The problem with using those sorts of technologies, at least according to those who are experts in these technologies, is that they’re simply not good enough to meet the desired outcome. In other words, GPS is really good at tracking where we happen to be located. Almost all of us use these sorts of technologies, whether for Google Maps or a range of other tools that are out there. They’re great, but they’re not so great at identifying whether people are located within six feet of one another, which of course is the social distancing metric that we’re all using right now. So it isn’t actually going to help us in terms of that proximity.

There are other technologies, and these are the ones that are picking up a bit more attention and, I think, a bit more favor. That has to do with approaches using Bluetooth. Some of the approaches that we see right now are more focused on Bluetooth, which doesn’t involve the longer, wider kinds of surveillance but is, or could be, useful in terms of people who are in very close proximity to one another. And if we were looking for a tool to deal specifically with some of those close-proximity questions: Is someone nearby? Am I, or have I been, at heightened risk? Those tools potentially could be useful.
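
To make the Bluetooth approach concrete, here is a minimal sketch of how an opt-in, on-device proximity log might work. It is loosely modeled on the decentralized Bluetooth proposals circulating at the time (such as DP-3T), not on any specific system described in this conversation, and the token format, 15-minute rotation interval and 14-day retention window are illustrative assumptions.

```python
import os
import time
import hashlib

TOKEN_ROTATION_SECONDS = 15 * 60   # rotate the broadcast token every 15 minutes (assumed)
RETENTION_DAYS = 14                # keep contact records for two weeks (assumed)

class ProximityClient:
    """Sketch of an opt-in, Bluetooth-style proximity logger.

    Each phone broadcasts short-lived pseudonymous tokens and records the
    tokens it hears nearby. No location is collected; matching happens
    entirely on the device.
    """

    def __init__(self):
        self.seed = os.urandom(32)   # per-device secret that never leaves the phone
        self.my_tokens = []          # (timestamp, token) pairs this device broadcast
        self.heard_tokens = []       # (timestamp, token) pairs heard from nearby devices

    def current_token(self, now=None):
        """Derive the pseudonymous token to broadcast for the current window."""
        now = now or time.time()
        window = int(now // TOKEN_ROTATION_SECONDS)
        token = hashlib.sha256(self.seed + window.to_bytes(8, "big")).hexdigest()[:16]
        self.my_tokens.append((now, token))
        return token

    def record_contact(self, token, now=None):
        """Store a token heard over Bluetooth from a device in close proximity."""
        self.heard_tokens.append((now or time.time(), token))

    def purge_old(self, now=None):
        """Data minimization: drop anything older than the retention window."""
        cutoff = (now or time.time()) - RETENTION_DAYS * 86400
        self.my_tokens = [(t, tok) for t, tok in self.my_tokens if t >= cutoff]
        self.heard_tokens = [(t, tok) for t, tok in self.heard_tokens if t >= cutoff]

    def check_exposure(self, published_tokens):
        """Compare locally stored contacts against tokens voluntarily published
        by users who later tested positive; True means a possible exposure."""
        positives = set(published_tokens)
        return any(tok in positives for _, tok in self.heard_tokens)
```

In a design like this, a user who tests positive would, with consent, publish the tokens their own phone broadcast during the infectious period; everyone else’s device checks that list locally, so the matching never depends on a central register of who was near whom.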

Hainsworth: I feel like I’m leaning heavily on Greek mythology here. What about the Achilles heel of either a GPS-based tracking technology or even a Bluetooth-based tracking technology? The GPS one doesn’t rely on someone having to install an app to track them whereas the Bluetooth technology would.

Geist: The privacy way of asking that question would be, is this going to be a consent-based model, or is it one in which you are going to be mandated to have this? And that is I think where there is a great deal of concern. 

I think everyone coming out of the privacy community will be adamant, and rightly so, that any proposed solution must be done on an opt-in basis. It cannot be that governments mandate that we are tracked in this way. That raises real concerns from a surveillance perspective, and the nice thing, so to speak, about a Bluetooth approach is that it is opt-in. You would choose to install the app. You would choose whether or not you want to participate.

Now admittedly, that may undermine its effectiveness and its reliability, which raises real concerns about whether you can have an opt-in system that achieves what you hope to achieve, given that it is potentially subject to people gaming the system, tricking the system or simply saying, we’re not willing to participate at all. If we move in that direction, we’re really relying on the public saying, we’re all in this together. This is something that, from a public health perspective, we’d all benefit from, and so we’re going to ask everybody to participate, and we don’t know what that outcome would be.

We do know, of course, that broadly speaking people have been really compliant [in Canada] when it comes to the public health requests that we’ve seen unfold over the last number of weeks and months. So the hope would be that as we open things up and there are further requests saying this could be helpful for your own health and safety, as well as the health and safety of others, people will participate.

Hainsworth: Do you see that as the ultimate end game, that we need to use the reopening of any given economy or society as the carrot for the stick of ‘we’ll reopen your economy, but you’re going to have to install this on your phone’?

Geist: I don’t think we’re there yet. I actually think that when you talk to the epidemiologists and healthcare officials and the like, they’re very consistent. When they talk about reopening, they talk about widespread testing and they talk about contact tracing. And then, of course, ongoing use of social distancing measures. 

First, take care to try to limit the likelihood of transmission as much as possible. That’s something we’ve been dealing with now for quite some time, and it seems it will be with us for much longer. That means everything from masks to hand-washing to the social distancing measures in terms of how close we come to one another.

Beyond that, though, it’s pretty clear that you need more than just that to reopen. So part of it is widespread testing to identify those who have the virus, including the many who may be asymptomatic, which is where some of the challenges lie. And then there is another element: identifying who may be at risk of having the virus. That is where contact tracing comes up. The use of this technology hits at this last point, the contact tracing side of things.

The ability both to warn people in real time that they may be in contact and, even more, to notify them later on, a week later or something, that they were in close proximity to someone who did contract the virus, highlights that they may have been put at risk. And so you might then take additional steps. You might say, well, you know what, I’m not feeling anything, but perhaps I’m asymptomatic and I came into contact [with someone who tested positive], and if we’ve got testing that is widespread enough, we might be in a position to say, okay, you should get tested, even under those circumstances.

Hainsworth: According to one website, 23 countries are currently using contact-tracing apps on smartphones. Do you see any country doing it right?

Geist: That’s a great question. And we do see a pretty wide range of approaches, everything from mandated approaches to opt-in approaches. The Singapore app, often referred to as the TraceTogether approach, which uses Bluetooth and is an opt-in model, is the one that’s been highlighted most commonly as fairly effective. It raises issues itself, though, because Singapore has now seen an upsurge. So whether or not it’s fully effective, it clearly doesn’t solve the issue.

Part of it also depends a little bit on what you’re trying to achieve. Sometimes these apps are used for contact tracing. Other times they have been used to try to ensure that people are respectful of self-quarantine requirements. We see that in places like Taiwan, and we’ve seen Israel seek to move in that direction. They can rely on GPS in that instance because all you’re really trying to do is identify whether someone has left their home and how far they’ve gone from it.

Those can be effective, and certainly some of the reports coming out of those countries are that if people are under self-quarantining requirements and they decide to take the phone with them, you can identify whether or not they’ve moved around, and then follow up, I suppose, to identify whether there was a legitimate reason for having done so. But I think that approach will still leave many pretty uncomfortable, noting that this is not an opt-in approach. It’s one that certainly has a big-brother surveillance type feel, where we really are surveilling people.

Now, there may be some who will say that, in a pandemic situation like the current one, if people are putting their communities at risk, and that’s why we have to self-quarantine, we need to find mechanisms to ensure people are respectful of that. Right now, we’re using the prospect, at least in some communities, of bylaw violations and penalties for violating self-quarantining requirements. These technologies demonstrate that we could go further, but there is unquestionably a major price to pay from a privacy perspective, and I think many would say that it ought to be a non-starter just for that reason.

Hainsworth: Well 28 percent of these apps have no privacy policy whatsoever. We also live in an age where when we get a privacy policy popping up on our screen, we don’t read it. We just click accept. How do we square the circle on that? I’m asking someone who is focused on digital rights in society. Does it even matter anymore?

Geist: Well, I don’t think that the privacy policy is the solution in this instance. We’re talking about potential proposals, and if they’re not opt-in, it doesn’t really matter what the privacy policy says. And I do think that even if we are operating on an opt-in basis, given the active involvement of public health authorities, we’re looking for something far more than just a privacy policy. We are looking for full transparency, to be sure. We need to ensure that the public is aware of what is happening, what is being collected, and what the limits are in terms of how that information is used. We need clear limits on potential further disclosure and a requirement to destroy that data where possible. In some instances, it may be data that we aren’t using for any personal identification purposes, but rather for trendspotting from a public health perspective, to try to identify where outbreaks may be occurring. So I think there needs to be full transparency, which would effectively be the equivalent of a privacy policy, but that really is only the start of what I think you need in that circumstance.

You need strong oversight so that there’s an audit function and the ability to ensure that there is no misuse. You need strong penalties for potential misuse, arguably criminal penalties for those who might misuse this kind of information. One would hope that doesn’t happen, but we know historically there are risks that it could. And we need sunset clauses, I would say.

We need to ensure that, if we do move in some of these directions, it is built into the system that it is designed to be temporary in nature. It should not be permanent, but rather something that has to be actively renewed every 30 days or every 60 days, or whatever the appropriate number happens to be, so that there is a continual need to justify and rejustify the ongoing existence of these programs should we continue to extend them.

Hainsworth: You’ve written that there are at least three types of data that may be collected and used to counter the spread of the coronavirus: aggregated data, cell phone location data of specific users, and identifying cell phone location data of those who came into close proximity to those in the second group. Are any of these scenarios viable without giving up privacy rights?

Geist: Well, the hope would certainly be that the first would be, because in that instance our interest is not with a specific individual, which is where privacy law typically falls, but rather with broader communities. And so if we are interested or concerned about increased spread of the virus in particular locations, it isn’t the particular individuals per se that we’re interested in, but rather the trends.

You might look at particular communities: low-income communities, communities in and around hospitals, or places where homeless people may be gathering. There could be a range of different kinds of places or communities where certain activities have drawn many people out. To the extent that we are able to gather data that places the spotlight on whether there may be increased spread of the virus in those communities, that is useful.

If we think about living in a more open situation than we’re experiencing right now, but one in which we are still battling the virus, still practicing social distancing and still fearful and experiencing regular spikes of increased spread of the virus, that’s the kind of information we might need. And it’s possible these technologies could help provide that to public health officials and do so without implicating a specific individual. 
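
As a rough illustration of how aggregated, trend-level reporting can avoid implicating specific individuals, here is a minimal sketch of a count-and-suppress step a data holder might apply before sharing anything with public health officials. The field names and the suppression threshold of 10 are illustrative assumptions, not a description of any carrier’s or agency’s actual practice.

```python
from collections import Counter

MIN_GROUP_SIZE = 10  # suppression threshold; illustrative, not a legal or policy standard

def aggregate_by_region(records):
    """Turn per-device records into regional counts, dropping any region whose
    count is small enough that it might point back to identifiable individuals.

    `records` is assumed to be an iterable of dicts like
    {"device_id": "...", "region": "Ward 5"}; only the region survives aggregation.
    """
    counts = Counter(r["region"] for r in records)
    return {region: n for region, n in counts.items() if n >= MIN_GROUP_SIZE}

# Example: per-device rows go in, only coarse regional trends come out.
sample = [{"device_id": f"d{i}", "region": "Ward 5"} for i in range(25)] + \
         [{"device_id": "d99", "region": "Ward 7"}]
print(aggregate_by_region(sample))   # {'Ward 5': 25}  (Ward 7 suppressed: too few devices)
```

The design choice here is simply that no individual-level field leaves the aggregation step, and small counts are withheld entirely, so what public health officials receive is a trend signal rather than anything traceable to a person.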

Once you get into the other kinds of data, where we’re talking about specific individuals, let’s say whether they are complying with self-quarantining requirements or have the virus, or about the prospect of people having come into contact with the virus, or about providing a warning to others that they may have come into contact with those people, it’s inevitable that that involves personal information. It’s highly personal, it’s highly sensitive personal information, and that’s why the kinds of safeguards and limits that we’ve been talking about are so absolutely essential if we’re to engineer a system that is respectful of privacy and that moves in this direction.

Hainsworth: Tell me more about those safeguards. What role does the telecom industry play in providing that data in a safe fashion?

Geist: I think there are a number of things that we have to think about. There has to be an appropriate vetting process for the system itself, to ensure that it is in fact doing what we need it to do. I think it needs to be optional; opt-in, consent-based approaches are, I think, absolutely essential. We need regular testing to ensure that it works. We need data minimization, to ensure that only the minimal amount of data needed to meet our objectives is collected. And I think realistically we need substantial adoption, which, in certain respects, represents a significant risk.

Are we going to get substantial numbers of people adopting this, such that it’s effective? On the other hand, I think it does tell us that if we are going to be successful in getting that kind of adoption, we need these kinds of measures. Because one of the things that we’re going to have to do, if we move in this direction, is convince the public that it is in their personal interest and in the broader public interest to participate. And I think the only way that you can get large numbers to participate is both to demonstrate the efficacy of the solution and, even beyond that, to show that the other risks that may come with it, such as the privacy risks and civil liberties risks, have been effectively addressed. You don’t get widespread adoption if people feel that they are having to trade their privacy in a way they consider unreasonable in order to obtain some of these health benefits, benefits that they themselves may not get but that really accrue to broader societal health.

Hainsworth: Let’s come full circle on the Pandora’s box metaphor. Some scholars interpret the box, actually a jar, as containing good gifts that escaped, not evils. And the story ends with the only gift that remained in the jar, as Pandora quickly closed it, being hope. Do you have hope that we’ll avoid this dystopian outcome some fear?

Geist: Well, you’re talking to someone who’s pretty pessimistic, to be honest, about not just that question but, frankly, the coronavirus more generally. I think we have been underestimating how long and how hard this is going to be, and I think we continue to do so. I say that as a non-expert in any of these issues, but as someone who can read math tables and can see historical trends in other pandemics, and I’m left wondering why we think this is going to be so significantly different.

So I think this is going to take a long time. And I do think that the new normal that we live with will not be the kind of normal that we’ve been accustomed to for our lifetimes. And it’s going to take a very long time to get to that new normal.

If this is something that, you know, in late 2020 and into 2021, we are still grappling with, still hoping for the vaccine – and even once we hopefully do get a vaccine, it may take months until we get the levels of production necessary to get it out into the broader community, which suggests it’s going to take a very long time – the pressure to identify potential solutions and try to bring us back to as close to the old normal as possible will only increase. And I believe that will push us even further toward being willing to explore approaches that previously we would have said are simply unacceptable.

And so I’m pessimistic about how long this is going to take, and I have some amount of pessimism about the adoption of some of these technologies. Not because I think there isn’t some hope in them. I actually do think there are opportunities, and I think it is incumbent on us to explore those possibilities, but rather because I think the desperation for solutions is going to increase day by day, week by week, month by month. And so the willingness to engage in what is perceived to be a trade-off will only grow.
