James Dempsey has been a leading expert on privacy and Internet policy for three decades. From 1997 to 2014, he was at the Center for Democracy & Technology, where he held a number of leadership positions, including Executive Director (2003 to 2005) and head of CDT West (2005 to 2014). In August 2012, after Senate confirmation, James was appointed by President Obama as a part-time member of the Privacy and Civil Liberties Oversight Board (PCLOB), an independent federal agency charged with advising senior policymakers and overseeing the nation's counterterrorism programs. He served in that position until January 2017.
James offered a legal perspective on data and privacy.
Steve, thanks very much, and Long, good to see you again. Delighted to be here today. Hello to everybody. Good morning. Good evening. Eric, good to see you. Congratulations, Long, really, both on the overall achievement in creating the Luohan Academy and on producing this latest in a series of reports, really opening up the Ant Financial, or Ali, data reserves to research, which I think is critical. So, I really appreciate the openness that you've shown here in this Dialogue.
So, there were a couple of things about the report that I particularly liked and particularly want to call out in terms of privacy. First of all, the report makes it clear that privacy is not really about data ownership, and that privacy is not well understood by applying ownership principles; ownership really doesn't address what we're getting at when we have all the concerns that we share about privacy.
Secondly, privacy is not only about control. Researchers have said for years that privacy is the right to control information about yourself, but that can be misleading, particularly because it leads to too much of a focus on the consent-based model of approaching privacy. As the report points out, and as we all know, consent is too readily given, and notice and consent really end up not getting to the core of the privacy problem. Unfortunately, policymakers and the laws that come forward still rely on it. For example, Patrick mentioned in his discussion that the GDPR (the EU's General Data Protection Regulation) is all in on consent. We recently adopted two successive major privacy laws here in California, the CCPA (the California Consumer Privacy Act) and the CPRA (the California Privacy Rights Act), which put a lot of emphasis on notice and consent.
So, as the report points out, previous research really doesn't address these kinds of concerns, and it can lead to the mistaken impression that you could somehow slow down the flow of data, which would then have adverse effects in terms of denying the benefits of data.
Now, there is a new angle which I'm going to discuss in a minute, which I think gets us closer to the core of privacy, or at least to what I think big data and its use implicate. The aspect of privacy that I think we need to focus on is privacy as fairness. That is, fairness in the use of information to make decisions about people. And the report quotes from Douglas North (North, 1990) about the fundamental theoretical problem underlying cooperation, which is how individuals attain knowledge of each other's preferences and likely behavior. I think that quote actually helps illustrate, or even explain, the privacy paradox.
Consumers understand the value of sharing their preferences, and consumers understand and accept the trade-off. For instance, I give you my location, and you give me mapping services. I give you certain profiling information, and you give me better ads in return. That explains, in my view, why consumers do share their information: they share it in return for a service that is of immediate value to them, whether it's better search, whether it's location, or, for that matter, a loan or other benefit. And of course, Ant Financial has been a leader in micro, small, and medium enterprise financing. It has made just huge, huge breakthroughs there.
But it's the "likely behavior" part where the fear and the distrust come in, and I really think it is fear: consumers fear the misuse of their data in ranking them, in judging them, in all the critical ways in which data is used. As to employment: are you going to be a good employee? For credit: are you going to be able to repay this loan? For housing: do you deserve, or are you qualified for, this unit of housing? For insurance: are you a good insurance risk? All of the predictive uses of data are very important and very powerful, and yet that's where the fear factor comes in and that's where the distrust comes in. And some of those fears are quite well justified.
Just looking at the United States, obviously this country has a long history and a continuing deep problem of racial bias in our society. And we see it, for example, in Facebook's use of big data to deliver advertisements related to housing, deciding to whom an ad was shown. The Department of Housing and Urban Development brought a complaint against Facebook, because even though the landlords didn't intend to discriminate, and even though the landlords were not breaking the law and weren't saying, "Show our ads only to whites and not to blacks," the Facebook algorithm delivered the ads in a racially biased way, using big data.
And this, to me, is the future of privacy, and privacy is such an inadequate word for it. But at least in the United States, our Federal Trade Commission, I fully believe, is on the hunt for algorithmic bias cases. They've issued a set of guidelines. They held a roundtable. Those are, in their way of operating, two precursors to bringing real cases. The Department of Financial Services in New York State has issued guidance on the use of big data in insurance rating, which of course has been around for a long time, but they are looking carefully at this. And there's a growing amount of litigation in the United States in the public benefits sphere, including disability, welfare, and other decision-making, which is increasingly being based on systems driven by big data.
So, to me, this is the next phase and should be the next phase, and perhaps the Luohan Academy can take it on as the next phase of some of its work, on two related questions: how to achieve fairness, and how to demonstrate fairness at scale, in the use of information. The traditional privacy frameworks are focused on redress, challenge, accuracy of data, et cetera, but there needs to be, and there is already some ongoing work, but there needs to be a lot more work on de-biasing and on identifying bias in systems before they are deployed. So, the value proposition of data is proven. The value debate, in my mind, should be over. The focus instead should shift to these questions of achieving and demonstrating fairness.
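To make "demonstrating fairness at scale" concrete, here is a minimal sketch, not drawn from the talk itself, of one common way regulators and auditors test a decision system for disparate impact: comparing per-group approval rates and applying the four-fifths (80%) rule used in U.S. employment-discrimination analysis. The data, group labels, and 0.8 threshold below are illustrative assumptions, not anything from the report.

```python
# A minimal, illustrative disparate-impact check on a model's decisions.
# decisions: 0/1 outcomes (1 = approved); groups: a protected attribute.
# All data here is synthetic and for demonstration only.

def selection_rates(decisions, groups):
    """Return the approval rate for each group."""
    totals, approved = {}, {}
    for d, g in zip(decisions, groups):
        totals[g] = totals.get(g, 0) + 1
        approved[g] = approved.get(g, 0) + d
    return {g: approved[g] / totals[g] for g in totals}

def four_fifths_rule(rates):
    """Flag disparate impact: the lowest group rate should be
    at least 80% of the highest group rate."""
    lo, hi = min(rates.values()), max(rates.values())
    ratio = lo / hi
    return ratio >= 0.8, ratio

# Illustrative loan decisions for two groups, A and B.
decisions = [1, 1, 0, 1, 1, 0, 1, 0, 0, 0, 1, 0]
groups = ["A", "A", "A", "A", "A", "A", "B", "B", "B", "B", "B", "B"]

rates = selection_rates(decisions, groups)
passes, ratio = four_fifths_rule(rates)
print(rates)           # per-group approval rates: A ≈ 0.67, B ≈ 0.33
print(passes, ratio)   # False 0.5 -> the check fails at the 80% threshold
```

Running a check like this over each protected attribute before deployment, and again on the system's live decisions, is one simple way to identify bias in a system before and after it is deployed, as the speaker calls for; real audits would use richer metrics and statistical testing.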
"…the fundamental theoretical problem underlying the question of cooperation is the manner by which individuals attain knowledge of each other's preferences and likely behavior." North, D. C. (1990). Institutions, Institutional Change and Economic Performance. Cambridge University Press.
See, for example, https://www.ftc.gov/site-information/privacy-policy.
For detailed meeting information, see https://www.ftc.gov/news-events/events-calendar/2010/03/exploring-privacy-roundtable-series
For the detailed guidance, see https://www.dfs.ny.gov/industry_guidance/circular_letters/cl2019_01
For more information, please visit Luohan Academy's YouTube channel: Luohan Academy