On April 26, the iPhone maker, Apple, released changes to its operating system that allow users to choose whether their apps can track them online. A pop-up screen asks users whether to “Ask App Not to Track” or to “Allow”. Most users were expected to say no. Already, Apple’s App Tracking Transparency push has sent reverberations throughout the $350 billion a year digital ad industry. Prior to these developments, the Luohan Academy’s Executive Director Long Chen and Columbia Business School Professor Patrick Bolton sat down to discuss what it means to think of data as a public good and the implications of putting the individual in charge of giving consent. Professor Bolton suggests consent-based privacy protections are based on the wrong principles. He also explains why liability rules and fiduciary duties could protect privacy better than a property- or ownership-based framework.
Our Guest:
Patrick Bolton, Columbia Business School
Transcript:
Debra Mao (00:01):
You're listening to Luohan on Air presented by the Luohan Academy. I'm your host, Debra Mao. The academy is an open research institute based in Hangzhou, China. Backed by Alibaba Group, it addresses the tremendous social and economic transformations brought by digitalization. This podcast features conversations with thought leaders from all around the globe about the most pressing issues in these fast moving times. On April 26th, iPhone maker Apple released changes to its operating system that allow users to choose whether their apps can track them online. A pop-up screen asks users whether to ask the app not to track or to allow. Most users were expected to say no. Already, Apple's App Tracking Transparency push has sent reverberations throughout the $350 billion a year digital ad industry. Prior to these developments, in early April, the Luohan Academy's Executive Director, Long Chen, and Columbia Business School Professor, Patrick Bolton, sat down to discuss what it means to think of data as a public good, and the implications of putting the individual in charge of giving consent. Professor Bolton suggests consent-based privacy protections are based on the wrong principles. He also explains why liability rules and fiduciary duties could protect privacy better than a property- or ownership-based framework. Here is their conversation.
Long Chen (01:34):
Patrick, so perhaps as an introduction, how would you describe yourself as an economist, and what field do you study?
Patrick Bolton (01:52):
Thank you, Long. Yes, let me start with that. Maybe the best is to say how I got into economics. At university, I studied political science first in Paris, then economics in Cambridge, England, and then mathematical economics at the London School of Economics. My PhD dissertation covers topics at the intersection of contract theory and industrial organization. This has remained my area of interest up until now. I completed my dissertation at MIT. My main advisor was Oliver Hart, and I was fortunate to have two other advisors, Jean Tirole and Eric Maskin. After completing my PhD, most of my research has been in the general area of contract theory, and later my research focused more on corporate finance, monetary economics, and financial regulation. In the last 10 years, I have focused a lot of attention on the topic of climate change and finance.
Long Chen (03:20):
Perhaps I should have asked you what you do not know. But because you are really great in contract theory, can you give our audience some basic background: what does contract theory do, and why do we need it? What are some of the general topics that are studied?
Patrick Bolton (03:43):
Yes, so most economists today take as their unit of analysis the individual, the individual consumer or individual producer. But, in many ways, a more appropriate basic unit is the economic transaction. Because trade is at the center of economics, and relationship-based transactions are at the center of trade. Contract theory is really the field that studies the transaction as its basic unit, and studies the relationships and institutions that govern trade. Unlike other fields of economics, say, for example, macroeconomics, contract theory is interested in institutions, the rules, the contracts, etc., that govern and facilitate trade. So it's really at the center of economics, I would say.
Long Chen (05:07):
Because today's main topic is data governance, what can we learn from contract theory when we apply it to data governance?
Patrick Bolton (05:19):
The way I understand data governance is all the rules that govern the production, integrity, and use of data. So that's a broad definition. When you apply contract theory analysis to data governance, the first key point to bring to bear is that data is really a public good. Then you can understand governance from the perspective of: how should public goods be governed? The problem with data production, data use, and preserving data integrity is that we're really looking at a situation that people describe as private provision of public goods. When you have a private provision of public goods problem, you have to worry about creating adequate incentives to produce and preserve data, and adequate incentives to encourage the use of data in a socially valuable way. That's what the contract theory framing allows you to do. Now, one of the key themes in data governance is the question of ownership of the data. For example, many people have suggested that the customer, the data subject, should be the owner of the data. For example, Luigi Zingales, who's a Professor at the Booth School in Chicago, has prominently pushed this line, saying that the customer should be the owner of the digital connections the customer creates, the data created through online searches. He's been advocating for a change in the law that would enshrine this ownership. In Europe, you have the rules around the GDPR, which in effect give ownership of the data to the user and the data subject by default.
Patrick Bolton (08:01):
As everybody who's listening in knows, I'm sure, the consequence of GDPR is that the first thing that happens to any user online, whatever website you're visiting nowadays, is that a pop-up comes up asking for consent. I thought it would be good to give an example of that. So yesterday I looked up a synonym for the word atom. I went to the website, thesaurusnewdictionary.com, right? The first thing that happens is the pop-up came up. Let me read you what it says: "We and our partners store and/or access information on a device, such as unique IDs in cookies to process personal data. You may accept or manage your choices by clicking below, including your rights to object where legitimate interest is used, or at any time in the privacy policy page. These choices will be signaled to our partners and will not affect browsing data. If you accept, we may do the following, use precise geolocalization data, actively scan device characteristics for identification, etc., etc." Now, I'm sure I'm not the only one to respond to these kinds of pop-ups the way I do. I never read them; I always say accept. Now, that's not good policy, and so as users we have an intuitive feel that this is not good policy. But contract theory tells you that it's not a good idea from an economic efficiency point of view to give an ownership right to the data subject. This goes back to what I was saying earlier, that the data that gets produced is a public good. It's a joint production process, and it's not right from an economic efficiency perspective to give ownership to just the data subjects.
Long Chen (10:27):
Why would people be so obsessed with ownership of data? Do you think it is possible to solve the data problem through ownership? Is that ever possible?
Patrick Bolton (10:42):
In the interest of being very clear, and maybe overstating my case slightly, I think that the idea of ownership of data by a customer is a bad idea. Instead, in terms of protecting the customer and protecting the good use of the data, I would advocate that it's better to think in terms of fiduciary duties of the person or entity who has control of the data. The fiduciary duty is, first, to the customer, to the data subject, but to society at large as well.
Long Chen (11:44):
But let me add one question here. When you say data is a public good, do you mean that whoever controls the data should, or has to, share the data with other people? Or does the person who controls the data have the right to decide whether to share or not, so long as it doesn't hurt the rights of the data subjects?
Patrick Bolton (12:18):
Yeah, that's a great question. I think the simple economic principle is that you want to maximize the value of the data. That goes through data sharing; data sharing is how you achieve maximum value. But because we're talking about a problem of private provision of a public good, this data doesn't fall from the sky; this data is produced. So you need to align your objective of making optimal use of that data by society with the incentives to produce and maintain the integrity of that data. You combine those two principles and you find it's typically the platforms, the vendors who produce this information by aggregating it, who should have control over it. And, of course, they're best placed to make decisions on how it should be shared, but they should make those decisions with, as I said, the key point in mind that they're fiduciaries, so they should have in mind the interests of the users, the customers, and the data subjects.
Long Chen (13:52):
Right. So who do you think produces data?
Patrick Bolton (13:59):
The raw input is the search activity and the decisions by agents online, but that's just a raw input. Then the data gets produced by the vendors, the data providers, who assemble it, standardize it, maintain it, and make it amenable to analysis.
Long Chen (14:28):
Is it fair to say the following: actually, the users, the consumers, do what they intend to do without the intention of producing data. So the information or the data about what they do is actually produced by observing and recording what's going on. That is different from the activity itself.
Patrick Bolton (14:58):
That is an excellent point. I think there's a lot of truth in what you've just said, and let me underscore the implication. An individual doing an online search, looking for a particular transaction, generating a lot of data in the process, is not able to understand the value or implications of all the data that's generated. So putting that individual in a position to have to make decisions on how that data is used, through consent-based protections, is highly problematic, because you ask, "Well, do you agree to the use of this data for purpose X?" But how does the consumer know, first of all, what purpose X is, how that information will be useful, etc., etc.? Most people have no clue.
Long Chen (16:03):
If we were to think about data as a public good, it is produced by somebody observing activity, but it cannot be separated from that activity. It's not independent of the activity itself. So that means it might benefit or hurt the interests of the subjects being observed. That's why we naturally think the consumers must be part of this. But, actually, they are not the ones who are producing the data. Now, a lot of people would still argue that the consumers, because they are part of this, should benefit from it somehow. How do you think about this?
Patrick Bolton (16:58):
Yes, absolutely. The consumers, because they're part of it, should benefit, but the consumer is not in a good position to decide what's beneficial to the consumer at the time when the transaction takes place. It's too complex a problem to absorb, so you need to delegate the responsibility of acting, of using data in the interest of the consumer, to the person best placed to make those decisions. The person best placed to make that decision is the vendor. This is what fiduciary law allows you to do. For example, what is the most extreme example of where fiduciary law is extremely helpful? It's when you have, for example, someone in your family who, maybe because they're old or they're ill, is no longer in a capacity to make decisions for themselves. So they appoint someone else in the family as a fiduciary who's in a better position to make those decisions. That person, of course, has to act in the interests of the incapacitated family member. That sounds like a really extreme example, but for data it's a little bit the same thing. The vendor is able to understand the value of the data, where it's going to be put to good use, how it can be shared. But the vendor has to act in the customer's interests, is the fiduciary of the customer. So that's why fiduciary law is the right way to think about data governance.
Long Chen (18:49):
Put into the context of protection here: maybe, on the one hand, the consumers, the users, have the right to object to what data can be collected. They should somehow be informed at the beginning that some data is being collected, and that gives the consumer some options. They have the right not to be part of this. On the other hand, if it's a market-driven mechanism, that means the users are willing to participate. Then it is up to the vendors, the producers of the data, to use it in a responsible way that is incentive compatible with the consumers and the users. Is that correct?
Patrick Bolton (19:48):
Yes, but I would put it even more strongly than you did. I would say that even the idea that the consumer should have the option to refuse information sharing is misguided. Even that is misguided. As a very concrete example, if the data is, let's say, medical data used to develop a new vaccine, which is going to be an enormous public good, should an individual have the right to refuse to share information on the medical characteristics that are important to share? To determine how the vaccine works and what the risks are? I'm not sure that is a socially good outcome. I think we should really think hard about this idea that for certain types of data the individual has a complete right to refuse to share. But the point is, when you give up, let's say, DNA information, part of what is contained in that information is pre-existing conditions. Now, you don't want the information that whoever collects that data can obtain about an individual to be shared with an insurance company that is writing an insurance contract with the individual. This is where fiduciary duties come in. The default is that you share information, even without consent. But, in the background, you have the protection of fiduciary law that the information will not be abused.
Long Chen (22:13):
In my view, one of the biggest misunderstandings of data is that people somehow think data is a standalone commodity that can be separated from our economic activities, our interactions with other people. But in my view, for users' welfare-related data, there are two things. One is that certain privacy rights should be protected when data is used. But, on the other hand, users also have the right to share information in order to obtain services or other things. The problem here is that if you really lock up the data, if you refuse to let it be exchanged, observed, and used, then there is no service to talk about. We cannot talk about the service without the data; they happen at the same time. That seems to be the key: you cannot separate the two. Do you agree? What do you think about this?
Patrick Bolton (23:32):
I 100% agree, and I think you put it extremely well. This is what I was trying to say earlier when I said that in contract theory, the unit of analysis is the transaction and not the individual. Exactly as you said, the data is not associated with just an individual; it's associated with an individual or multiple individuals and a transaction, and it's associated with a history of a transacting relationship. It's not like Lego; you cannot separate out the pieces and say, "This is the individual's and this is not." It's all integrated, and that's why it's a public good. I think this is really fundamental. Now I've lost the train of thought here, but I 100% agree with what you said.
Long Chen (24:28):
Okay. But some people will still argue that because the vendors, whoever uses the data, might benefit, let's say, make some profit, the users might feel it's right that they should share some of those profits. Or maybe, because it's a willing activity, they have already benefited from the exchange itself. How do you think about the distribution of benefits here? Is this a major issue? What can we do about it?
Patrick Bolton (25:14):
It is an issue. The distribution of the value created is an issue. One of the key ideas in contract theory with respect to private information is that private information can be seen as like a monopoly rent; it protects the owner of the information. Now, we understand that if everyone has monopoly rents, we get inefficiency. So the whole challenge is to allow for value creation and then somehow have a relatively fair distribution of the value that's created. Here we're really touching on the broader issue of competition. In a world where vendors are competing with each other, and where they can generate a lot of value by aggregating and managing data, well, if they're competing, they're going to compete away some of that value to their customers. I think that's how we should think about it.
Long Chen (26:34):
I see. In your view, because of the nature of data, we probably should not apply property rules to it; we should use liability rules. Can you say a bit more on this? In what similar situations have we used this kind of arrangement? Can you make some comparisons so that the audience knows better what you think?
Patrick Bolton (27:08):
Yes. This is a great question, a very big question, and the terms that you used, property rule and liability rule, go back to a classic article by Calabresi and Melamed, published in 1972, where they really formulated the distinction between property rules and liability rules. The best way to understand the distinction is to think of a property rule as an ex ante decision based on consent, whereas a liability rule is an ex post decision. In some contexts, property rules are okay; in other contexts it's better to have a liability rule. Now, I'm simplifying here, because, in fact, there's a big literature in law and economics that argues that liability rules are always better. I don't want to go that far, but to give you an example of why it's important to think about property rules and liability rules, it's often something that economists do not fully appreciate. When you write a contract, let's say I write a contract with you for you to deliver a good to me, let's say a month from now, and I promise to make a payment. I may even make the payment in advance, and then at some point when you're supposed to deliver the good, you don't. You don't because, for example, you realize this good is much more valuable to someone else than to me. You're in breach of contract. Now, a property rule would say, "You have to give me the good, no matter what. You made a promise, you have to execute it." A liability rule would say, "No, you were right to breach the contract. You sold the good to someone who values it much more, but you then have to compensate me for my loss. You have to pay me damages." That's the basis of all of contract law. So the broader question is: when do we make a decision ex ante? When do we make it ex post? As we've just discussed in terms of the use of data, it's very difficult to make a decision ex ante on the right use of data.
You want to make that decision ex post, when it's easier to understand, "Was it appropriate to share the information or not?" That's how you apply the liability rule: if it wasn't appropriate to share the information, then you have liability, you have to compensate for damages.
Long Chen (30:08):
We're so used to the concept that ownership is crucial for modern society, to the extent that, perhaps, we do not pay enough attention to the important role of liability rules. Do you agree with that?
Patrick Bolton (30:33):
Yes, I agree with that. In particular, in the area of privacy protection, we've gone too far in the direction of thinking of privacy as a property right. There's another confusion with property law and privacy, and this brings me back to the point about the pop-ups that I mentioned earlier. For some websites, for some transactions, once you open the pop-up and start reading it, it's pages and pages and pages. Now, the way property law works, that should not be allowed. For example, if you own an apartment, it's your property. You registered the title with the registry, okay? Then everyone knows it's your property. Everyone can consult the registry and find out it's your property. But then you ask, "Well, what can you register as property on these registries?" Well, it's only a very well-defined, small set of titles, and they have to be structured in exactly a standardized way. If you apply that idea to these pop-ups, the pop-ups should only have maybe three lines, everything standardized, and the consent should be about very specific things and nothing else. But the abuse of property rights when it comes to this consent requirement is that there are no limits put on all the things that you need to consent to, all the complexity. It becomes so complex, so difficult to read through, that basically people either give up or they don't take it seriously. In very general terms, I think the best way is to just remove them, not have them. No consent, but maybe have requirements on vendors to put on their websites what their duties are with respect to protecting privacy and sharing information in the interests of the customers, and then maybe have monitoring and supervision of how information gets shared by a regulator who is better positioned to do so.
Long Chen (33:04):
Another angle to look at this: there are bigger vendors and smaller vendors, let's say, SMEs and startups versus bigger companies. Obviously, they have different abilities for data protection and so on. If the restrictions, the requirements, the standards are too high, the startups and SMEs don't have a chance to make use of data at all. They cannot get into the data-driven business. Then it's actually the big companies that might benefit more from this. From the contract theory point of view, from the liability rules point of view, how do you think we should arrange this?
Patrick Bolton (33:55):
Yes, that's a great question. I think data creates a barrier to entry, as you put it, and it imposes switching costs on the customer. That can distort competition, and you need to balance the forces of monopolization here against the need to preserve incentives for data creation, data use, and so on. Here we're really at the intersection of antitrust law and the other areas of law we've discussed, data protection, fiduciary law, and so on. Now, to the extent that incumbents are sitting on so much data and are able to deploy it so effectively that they become formidable competitors, you worry: are you in a way killing competition? And by killing competition, are you killing innovation? Those are antitrust questions, and there are antitrust principles that can be applied to the situation. Now, I think that's a very fast-moving area. I don't want to say too much here, because I think antitrust has not caught up with the fast development of big tech, all the issues that surround it, and how to apply antitrust to those areas. I think a big modernization of antitrust law is required to apply it to these areas.
Long Chen (36:15):
Coming back to your idea of liability rules: right now across the world, in Europe, in the US, in Asia, in China, many countries are trying to step up privacy protection through laws and regulations.
Patrick Bolton (36:36):
Yeah, that's right.
Long Chen (36:38):
For example, in Europe, we have the GDPR; in the US, we now have the CCPA. What do you think of these regulatory attempts? How effective are they? What are the things they can improve?
Patrick Bolton (36:56):
Yes, so it won't surprise you if I say I don't think they're very effective, because I think they are premised on the wrong assumption: that data privacy is a property right issue. That's the wrong premise. The premise should be that protection of privacy is not synonymous with protection of a property right. Protection of privacy does not require consent; it requires fiduciary duty. It requires a duty of loyalty on the part of the entity who makes decisions about how to share and use the information. I think that's a very simple idea, but unfortunately neither the European approach nor the American approach is based on that idea, and I think that's where they're going to run into problems.
Long Chen (38:10):
From my observation, both the GDPR and the CCPA do not explicitly define who owns the data. In some sense, they still keep the essence of the earlier information principles, but somehow they get mixed up a little bit between the liability rules and the property rules. They do not explicitly define who owns the data, but then they give a lot of benefits, a lot of rights, to the users, the data subjects whom the information is about, for them to decide a lot of things. That implicitly assumes some property rules. There's some mixed use of the two. Am I right to say this? Because I think no country actually specifies, "Okay, data belongs to the original data subject. Their activities are recorded, so it belongs to them." I think all the regulators realize that it doesn't work this way. But then they want to apply the property rule somehow, and they vary in how far they go. Am I right to say this?
Patrick Bolton (39:50):
Yeah, this is very well put. I think that's exactly right. Ownership of data, who owns the data, is not defined in the law, for the reason you said: it's impossible to separate the data created from a transaction from the transaction itself and the individuals involved in it. Coming back, again, to this example, maybe a bad analogy, but the Lego: data is not like Lego pieces, where you can say one piece belongs to the customer, another piece belongs to the firm, and another piece belongs to some other... You can't do that with data. Once the data is created, you cannot reverse engineer how it was created and separate it out. So the law is silent on who owns data for that reason, a basic technological and conceptual reason. But then, failing to say who owns what, the law tries to overlay this false sense of security on privacy protection and data protection by giving consent, very broad consent, to the user and the data subject. The reality is, the data subject has no idea. They don't know what consent implies. What do we get? We get that everybody consents, because they know otherwise they cannot transact. The consent has very little meaning, and then what we have is some bad vendors who abuse the data, who share it when they shouldn't be sharing it. Then they say, "Yeah, but the customer consented. I was given the right to share it whichever way I wanted." I think that's a very bad situation.
Long Chen (42:04):
In your view, then, a proper understanding of contract theory will play a very important role in the digital age, at least in data governance issues. Would you agree with this? There's a lot of research and thinking to be done about how to apply the right contract theory principles in this regard; that seems to be what's really lacking at hand. What would you say to the future researchers who should work on this? What would you advocate?
Patrick Bolton (42:47):
Yeah, so what you said about applying the principles, and getting a better understanding of the digital economy from that application, that's clearly a fruitful area. But I also think that contract theory cannot just develop in a vacuum on some other planet. Contract theory has to be informed by what's going on out there in the world, in particular, how the world has been changed by the digital economy. There I see huge potential. For example, what we were saying about how digitization transforms data, and in particular, processing, communication, and storage. Now, one particular kind of data is the contracts themselves. They are now mostly in digital form; they can be stored. So you can imagine databanks of contracts, billions of contracts applying to all kinds of situations. That's a goldmine for contract theory. Being able to look into that: what contractual terms work well, where contracting parties innovate in a fruitful way, what kinds of contracts give rise to litigation, what contracts open up fruitful relationships and help align interests? That's a goldmine out there. Also contract enforcement, and here one example that really struck me was the first Luohan Academy report. It had to do with how, I believe, Alipay or Alibaba, I forget now, handles contractual disputes with customers. The idea is to have a jury system, some adjudication, and a quick resolution of disputes. I think the average time is like a week, something like that. Correct me if I'm wrong. What was amazing to me, and a complete revelation, is how everyone thought this was fantastic. All the contracting parties said, "Finally, we have a dispute resolution system that works quickly and efficiently." Now, how would you have discovered that without access to this experience? I think there's just so much to learn from the digital economy for contract theory.