It's not hard to find original conversations about the dangers of "Big Tech", but it is rare to find original solutions. On this episode, we sit down with renowned professor and author, Francis Fukuyama, who has developed a fresh answer to the question of how to rein in the big digital platforms like Facebook, Twitter, and Google.
Francis Fukuyama: Even if you did succeed in breaking up these companies, they may simply re-form again. That’s one of the reasons I don’t think that traditional antitrust would work.
Bethany: I’m Bethany McLean.
Speaker 3: Did you ever have a moment of doubt about capitalism and whether greed’s a good idea?
Luigi: And I’m Luigi Zingales.
Bernie Sanders: We have socialism for the very rich, rugged individualism for the poor.
Bethany: And this is Capitalisn’t, a podcast about what is working in capitalism.
Speaker 6: First of all, tell me, is there some society you know that doesn’t run on greed?
Luigi: And, most importantly, what isn’t.
Warren Buffett: We ought to do better by the people who get left behind. I don’t think we should have killed the capitalist system in the process.
Speaker 7: The House Judiciary Subcommittee on Antitrust releasing its long-awaited report on big tech.
Luigi: After a 16-month-long investigation, a subcommittee of the House Judiciary Committee issued a very long report.
Speaker 8: And it finds that Amazon, Google, Facebook, and Apple have engaged in anticompetitive behavior.
Luigi: Outlining what can be done to rein in the market power of digital platforms.
Speaker 8: And they issued some recommendations, notably structural separation of these businesses.
Bethany: I think this conversation about what we do about big tech’s power is one of the most important conversations that we can have. The real danger of the power of these platforms is not that they distort markets. It’s that they threaten democracy. So, that’s the conversation we wanted to have on this podcast.
Luigi: And, for this purpose, we’re going to bring in a very illustrious guest, Francis Fukuyama. He has written a lot of books. The latest one is actually one on identity politics. But the reason why we brought him here is because he just chaired a committee that released a white paper on platform scale. Welcome, Francis.
Francis, what makes you so worried about the scale of the platforms?
Francis Fukuyama: The issue of scale is related to their control over political speech. The platforms do not simply transmit what people post in a neutral fashion. They have this unbelievable ability to either accelerate or suppress different kinds of content. That means that the old rules about free speech, where you just leave it up to a free-for-all market of ideas and see which ones rise to the top, don’t really work in an era of gigantic platforms like Facebook and Google. When you’re served something on your feed, it’s not evident why you’re getting the kinds of information that you are. It’s all being determined by AI algorithms that are feeding stuff up to you based on the platform’s self-interest, and not necessarily based on some idea of what is good for public discourse in a democracy. I think that’s the core of the problem.
Bethany: This topic, obviously, has become increasingly mainstream over the last couple of years. Indicative of that is that the Justice Department finally brought a lawsuit against Google. Did they bring the right lawsuit, in your opinion? Does the lawsuit matter, or is it beside the point?
Francis Fukuyama: The current emphasis really is on antitrust as traditionally defined, which is really looking to remedy economic harms. These are important. Many of them actually probably do constitute violations of existing antitrust law, but they don’t get at the political abuses that, for me, are the most concerning about the power that the platforms have. A kind of classic structural breakup of Facebook may not do any good, because one of the baby Facebooks may end up being the same size as the original Facebook within a few years.
Bethany: There was something about your report. As I read it and came across that line, “baby Facebook,” it just sent a chill down my spine. I don’t know why that concept was so terrifying.
Francis Fukuyama: The other big problem is, if you say you want antitrust to serve political ends, that’s something that can get misused. There’s one area where I actually agree with Robert Bork, one of the founders of the consumer-welfare doctrine, which is that the political use of antitrust had been to protect small producers against giant corporations. If you get back into that business, essentially, you’re putting judges into a situation where they have to allocate a consumer surplus between an inefficient small producer and a large, efficient corporation. And how in the world you make that decision fairly, I think, is a really complicated question.
The other big issue is politicization and capture. We’ve already seen the Trump administration use antitrust law, and, I think, abuse it. They were trying to block the AT&T-Time Warner merger because they didn’t like CNN’s coverage of themselves. Once you get into the politics side of antitrust, there’s actually no guarantee that it’s going to be used for what we would call good, democratic political ends. There is a long tradition in modern democratic theory that says that you can’t just trust the good intentions of the current power holder. You may like the president. You may think that the president is doing all the right things that you would approve of. But you don’t want to give that president unlimited authority, because maybe tomorrow you’ll have a different president whose policies you don’t approve of.
Luigi: So, you don’t want to nationalize Google. You don’t want to use antitrust. What do you want to use to keep its power in check?
Francis Fukuyama: We have a proposal for something we’re calling middleware. Middleware is software that sits on top of the existing platforms and basically serves as an interface between the user and a platform. There’s a light version of this and a heavy version of this. The heavy version would have the middleware actually be the gateway to the platform’s content. So, the platform essentially becomes a dumb pipe that simply provides information, and then the user experience is completely shaped by the middleware.
The light version is where the platforms continue to do what they do. They serve up the content they want to serve up. But the middleware behaves like Twitter is behaving now. They tag things. They say, “This is controversial,” or, “This does not check out,” or, “Please go to this website for further information about this point.” So, those are the two extremes.
In our report, we don’t really take a position on which of these would work. I think the first alternative is going to be fiercely resisted by the platforms, because it really undermines their business model. The second one, I think, probably gives them a little bit too much power. It’s better than nothing, but what we think is that there is perhaps an intermediate position where the middleware company can basically act like knobs and dials that an individual consumer can turn if they want certain kinds of content delivered to them. That would give power back to the individual user and take it away from the platform.
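[Editor’s note: the “knobs and dials” idea can be sketched in a few lines of code. This is purely illustrative and is not described in the report; the feed fields, dial names, and thresholds below are hypothetical assumptions, not any real platform’s API.]

```python
# Hypothetical sketch of the "intermediate" middleware idea: the platform
# exposes raw feed items through an open API, and a user-chosen middleware
# layer filters and labels them according to dials the user sets.
# All names (FeedItem, the dials, the scores) are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class FeedItem:
    text: str
    outrage_score: float = 0.0        # 0..1, how inflammatory (middleware's own rating)
    source_reliability: float = 0.5   # 0..1, middleware provider's rating of the source
    labels: list = field(default_factory=list)

def apply_middleware(feed, max_outrage=1.0, min_reliability=0.0):
    """Filter and label a platform feed according to user-set dials."""
    result = []
    for item in feed:
        if item.outrage_score > max_outrage:
            continue  # user turned the outrage dial down: drop the item entirely
        if item.source_reliability < min_reliability:
            item.labels.append("low-reliability source")  # keep it, but tag it
        result.append(item)
    return result

feed = [
    FeedItem("calm policy analysis", outrage_score=0.1, source_reliability=0.9),
    FeedItem("viral outrage bait", outrage_score=0.9, source_reliability=0.2),
    FeedItem("unverified rumor", outrage_score=0.4, source_reliability=0.1),
]

# A user who dials outrage down and reliability up gets a filtered, labeled feed.
filtered = apply_middleware(feed, max_outrage=0.5, min_reliability=0.3)
```

The point of the sketch is the shift in who holds the dials: the thresholds belong to the user and the scoring belongs to a middleware provider the user chose, while the platform only supplies the raw feed.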
Bethany: You say that one of the arguments against breaking up the companies is that it would take a very long time. But wouldn’t the process of getting them to accept any kind of middleware solution that would actually do anything take a very long time as well? In other words, how do you get these platforms to accept something that is going to undermine both their economic model and their influence, even if, on some level, they don’t want to be the gatekeepers of free speech?
Francis Fukuyama: The hope lies in your last point. As far as we can tell, most of the owners of these companies really don’t want this responsibility of being the arbiters of appropriate political discourse in the United States today. Jack Dorsey has said that pretty explicitly, and you can see why. I mean, when they decide to label something that President Trump says as not verified or inaccurate, it gets half the country pissed off at them. So, I think that they’ve actually got an incentive to hand this function over to somebody else that could be blamed for making those kinds of decisions.
The heavy version of middleware is something I think they are going to resist pretty fiercely, which is why I think there probably needs to be some intermediate version that allows them to exercise basic control, to continue targeting advertising for commercial purposes. But it takes a lot of that power away when you get to things that are more directly involved with democratic politics.
Luigi: Well, let’s talk about the light version. Why do you think the light version will help? First of all, it’s not that different from what Twitter is doing right now. Maybe it relinquishes a little bit of the power, because right now it’s Twitter that says something is unverified, whereas under middleware you could choose who does the verification. But we know that on the internet, most people are lazy, so they’re going to go with the default. So, I see very little changing.
Francis Fukuyama: Well, this gets back to the suit against Microsoft over making Internet Explorer the default browser on its operating system. That antitrust case forced Microsoft to back down. So, let’s begin with the observation that you couldn’t do this voluntarily. There would have to be some kind of statutory change that would force the platforms to open up their APIs to make this technologically feasible. I think that, at that point, you would also have to have some government oversight and regulation to make sure that there’s an appropriate business model giving these middleware producers an incentive to actually take on this kind of power.
Luigi: But my concern is that under the light version, I don’t see the business model of the middleware. Where do they get their revenues?
Francis Fukuyama: You would have to actually force the platforms to share some of their advertising revenues. This is obviously something that they’re going to resist to the extent that they can. Obviously, the degree to which those revenues would be shared would depend on the degree to which these middleware companies actually added value. I think that the light version probably doesn’t add that much value. I think, for example, that Twitter’s labeling of tweets, at this point, is probably discounted by everybody who looks at them, because conservatives now have it in their heads that the big tech companies are all biased against conservatives. So, they simply disregard it.
But I think you need to think about how middleware could actually play out in practice. You could have, for example, a consortium of universities that got together and established a nonprofit organization to filter internet content for students and academics. It would guide people to reliable sources of information that they could use in academic research. In that model, it would be costly, and it would be a big undertaking, but you could imagine foundations and other philanthropic sources paying for this kind of service to be performed, just as ProPublica is a journalistic resource funded by private donations.
In other cases, you can actually imagine more commercial incentives. I mean, although we’re focused on political discourse, you could imagine a service that would say, OK, you’re shopping on Amazon. Rather than Amazon serving you up a menu of products to buy, you can turn the dials, and if you want to buy only American products, you can set that setting. If you want to buy ecofriendly products, there’s another dial that you can push that will give you that. That’s a service that consumers might actually be willing to pay something for.
Bethany: I have two questions on this front. The first comes from a cynical point of view: if what makes the business model of these companies work is the crack cocaine, the outrage, some of the very things that middleware is trying to solve, does the concept of middleware just fundamentally undermine the whole thing from the beginning, in that it’s no longer going to be fun?
Francis Fukuyama: Well, the outrageous content and the clicks have come under a lot of scrutiny. So, it’s already the case that these platforms are pulling back from simply blindly promoting as much of that stuff as they can. I think right now they have to be pretty careful. Having these kinds of filters may actually increase the value of being on the platforms for people. I certainly think that if I could actually control what the platforms showed me, I would probably use them more, because I don’t like the idea of being manipulated. So, there’s some possibility that it could actually increase their popularity and revenues. These are all empirical questions that we don’t know the answers to right now, but I think it’s not obvious that this is an unworkable idea.
Bethany: That’s a nice, idealistic view of human nature, so I’m going to go with it, because it makes me feel good today. The second question, you explicitly address in your paper that this doesn’t solve politicization, and perhaps nothing actually can. But if this draws us to choose middleware providers that serve us up the content we already want, maybe that’s still better than the status quo. Is there anything out there that can get us to have a national conversation again, or are we just past that point?
Francis Fukuyama: There’s a forlorn hope that somehow, we can return to the age of Walter Cronkite, where you have a single trusted media source. A little bit like some of the existing European public broadcasters, the BBC in Britain or ZDF in Germany, that would actually get everybody on the same page and would be regarded as trusted and authoritative and the like. I’m afraid that given where American politics is today, that’s just not going to happen. I think that we’ve seen depths of polarization, especially in the period since November 3rd, that are very discouraging. I just don’t believe that there’s going to be a way to get everybody to agree on things.
Frankly, I don’t think that should be the objective of public policy, because we do have the First Amendment. We do believe that if people want to believe some crazy story about Hugo Chavez manipulating the election, they’ve got a right to do that. What they don’t have a right to do is to amplify or to artificially suppress certain types of information. That’s really where the platform scale really matters, where these platforms really do have that power.
What you’re going to get is, yes, people will retreat into filter bubbles. For every university consortium that directs its users to reliable sources of information, there will be crackpot filters; there will be a QAnon filter that directs people to QAnon sources. But I just think it’s unrealistic to think that you can stamp that out, and I don’t think we ought to be aiming for that. Furthermore, I think it’s technologically impossible.
Luigi: Yeah. My concern is actually the power that these platforms have today to suppress information. We saw, in the case of the infamous Hunter Biden story, that many platforms decided to block links to an article from the New York Post. You might or might not believe the New York Post, but the fact that information was obtained illicitly did not prevent The Washington Post from publishing the Pentagon Papers. This arbitrariness in deciding what information is worthwhile and what is not scares me a great deal.
Francis Fukuyama: Well, that’s exactly right. I’ve seen people that I really respect and rely on say that they’ve been kicked off of Twitter for something that they posted that, when you look at it, it looks quite innocuous, but Twitter was just interpreting it in a very strange way. Again, this is why I just don’t think that these kinds of decisions should be left in the hands of these private companies. The more that you can return that power to the actual consumers of information, the better off we’d be.
Luigi: Wouldn’t the solution be to eliminate the editorial power? If I post on Twitter and you follow me, you choose whether to receive it or not. But they promote tweets. Let’s forget advertising, that’s a business model, but they promote tweets to keep you on the platform longer. They purposely promote the most provocative tweets, just as Facebook promotes the most provocative posts. So, why don’t we just allow people to share? In the past, if I was crazy and shared my views with my friends, nobody prevented me from doing that. If I called my friends on the phone with my crazy QAnon conspiracy theory, the phone company was not filtering that, but it was not promoting it either. So, we could allow the platforms to share information but not to push information.
Francis Fukuyama: No, that’s exactly right. I think the things that you want to target are those places where you’re not just posting and looking at your friend’s posts. You’re actually getting something that the platforms have cooked up. Every single one of them does that in one way or another. You would definitely want a middleware company to take over.
Bethany: I’m going to be cynical once again, but it seems to me that as much as the companies may not want to be in this business, they also are not going to want to give up any economics. Perhaps more to the point, their shareholders are not going to want them to give up any economics. How do you reconcile that shareholder issue, and have you heard any feedback on this front?
Francis Fukuyama: No. So, this is what we’re planning to do in the coming weeks. Take this idea on the road and just get that kind of reaction, because, as I said, there could be reasons why even shareholders may see this kind of service increasing the value of certain of the offerings that these companies have, because people are pretty fed up with the way that they’re being served this information right now. But, look, I’ll be the first to admit, this is an empirical question. I’d like to speculate about it, but until we actually road-test some of these ideas, we don’t know what the answer is.
Luigi: If you believe that this is an exercise of monopoly power, there is no doubt that regulating it will make shareholders lose money. That’s, in a sense, I won’t say the purpose, but it is a natural outcome of any antitrust enforcement. In fact, I will even claim that if you don’t do so, you probably are ineffective, because you’re not really aiming at the problem. My preferred solution would be regulation to say the platforms cannot be in the business of editing. Somebody else separated from the platform can be in the business of editing. I’m a big supporter that if you have to do regulation, the regulation must be as simple as possible, because the more complicated it is, the more the incumbents are going to manipulate it to their own advantage.
Sometimes it’s better to be sharp and aggressive rather than being very sophisticated in fine-tuning, because then the fine-tuning is completely undone in the regulatory phase. It has happened in the financial industry with the so-called Volcker Rule that wanted to separate activities. But it ended up failing and being very costly in the process, because a lot of people are lobbying in one direction or another.
So, what I am suggesting is a simple regulation: from now on, if you own a digital platform, you cannot be, in a sense, the market player and the arbiter at the same time. You can be a market player, you can provide the platform, but you cannot be the one editing for your customers. An independent company should do that. Do you agree with that line?
Francis Fukuyama: Yeah.
Bethany: So, I agree with you both, and I love the analogy of the Volcker Rule. I think that’s really interesting. I was just being the practical journalist as opposed to the theoretical economist: even if I agree with you 100 percent, the path from here to there is determined by feasibility. You have companies that have to give up economics, shareholders that have to agree to giving up those economics, and legislation that has to be passed in what is still likely to be a fairly polarized Congress. You can see that polarization in the fact that the Democratic report on the tech giants and the Trump administration’s lawsuit against Google are not actually so very different, yet neither party will sign off on what the other is trying to accomplish. So, my comment was made from a feasibility standpoint, not an ideological one.
Francis Fukuyama: But on the feasibility issue, this is one of the areas where there is some degree of bipartisan consensus that neither the Republicans nor the Democrats really like the idea that these platforms are exercising this amount of power. Today, I wouldn’t bet on bipartisanship in almost any aspect of American policy, but this one has got a little bit better chance than some of the others.
Luigi: But I see a chance with Facebook, because Facebook has irritated enough Democrats that maybe they’re going to go after it. I don’t see regulation of Google or Twitter coming anytime soon, especially from a Democratic administration. If you look at the amount of donations from Silicon Valley to the Democratic Party, at the friendship between them, even at who has been appointed to the transition team, I don’t see a radical agenda there. I see more a celebration of big tech than a reining in of big tech.
Francis Fukuyama: Yeah, although the grassroots still get quite exercised about this, particularly the progressive wing of the Democratic Party. Joe Biden, I don’t think, can afford to look like a tool of these big corporate interests. He’s got to give the progressives some victories. So, maybe this will be an area where he can offer them something, but I don’t know. The politics of this is obviously going to be very, very complicated. There are lots of reasons why corporate America in general doesn’t get regulated very heavily, because they influence everybody.
Bethany: That goes to the point, Luigi, that you come back to over and over again: Google is one of the biggest corporate spenders in Washington. The company spent almost $13 million on lobbying in 2019. So much for the idea that big tech is apolitical in any way, shape, or form. These companies became very aware that Washington was critical to their business, and they have acted on that.
Luigi: The most dangerous combination would be if, all of a sudden, Google and Facebook were to embrace a Trump or a Trump-like person. That would really lead to a concentration of power that is pretty scary. Except for the ideology of their employees, I don’t see anything standing in the way of that. There is a marriage of interests that is pretty strong.
Francis Fukuyama: Well, that’s exactly right. And you can’t say it’s impossible that a Rupert Murdoch or a Julian Sinclair Smith will buy Twitter at some point. Trump himself is thinking of starting a new media company. If he’s smart, he’s actually going to try to get a social-media platform of his own for all of the disaffected conservatives. The ownership of these companies could put us in a pretty politically dangerous position.
Bethany: I think that’s why the key line in the report for me was, “The question for American democracy, however, is whether it is safe to leave the gun on the table.” Because Democrats may think today that the gun is their weapon of choice, but given the radical swings in US politics and how quickly things change, you could quickly imagine that gun being pointed in the opposite direction. The gun on the table is never safe, right, even if—
Francis Fukuyama: That’s right.
Bethany: —you think it’s your weapon. It’s not.
Luigi: Yeah. This is true for real weapons. It’s, of course, also true for metaphorical weapons. I think that we should all appreciate the analogy. I think it’s a beautiful analogy.
Francis, anything you want to say that we forgot?
Francis Fukuyama: I’d just like to say that we are going to be rolling this idea out. As you can tell, there are a lot of undetermined aspects to this, like exactly what middleware would look like, what kinds of powers it would have, how you fund it, how you implement it technologically. We don’t have answers to any of this.
So, what I would like to see is people taking up the idea, thinking about it, discussing it, really seeing whether this is a viable way forward. Because the other thing we do in our report is we go through all of the other tools that are out there: traditional antitrust law, privacy law, data portability. We just don’t think that any of these is going to be adequate to solve the political problem that we see, that loaded-gun problem. Therefore, if this would work, we would really like other people to help us figure out how to bring that about.
Bethany: There were a couple of really critical things that I had not thought about before. I do think this linkage between the economic power that these companies have, the power they have to influence conversation, and their possible threat to democracy is a really important one. It’s been floating around out there before, but I haven’t seen it made in quite as concise a fashion as this report did.
Luigi: No, absolutely. But historically, the idea of democracy was intrinsically linked to people owning a piece of land. Why? Because owning land gave you the ability to work independently.
Bethany: Can you imagine if they gave most of us a piece of land today? I think I’d starve. How about you guys?
Luigi: Me, too. But the point here is that for a long period of time in the United States, the equivalent of land became a shop or some other small occupation that made you independent of concentrated power. My hypothetical question is, imagine that everybody was an employee of Amazon. Could we have a democracy if everybody is an employee of Amazon?
Bethany: Absolutely not.
Luigi: Exactly. Now, pushing this a step forward, imagine if you’re an employee of either Amazon or Google or Facebook. Can we have a democracy if everybody is an employee of these three?
Bethany: Absolutely not.
Luigi: So, I think that this is where economists and political scientists collide. The entire field of economics has been raised with the myth of efficiency at all costs. This is saying, you have to sacrifice efficiency for a bigger principle, which I’m very willing to do. On one extreme, you don’t have a democracy if everybody works for Amazon. On the other, having a democracy when everybody is starving is maybe not the ideal state of the world either. So, you want something in between. Now, one of the beauties of all the increases in productivity we have had is that you don’t need to starve to have a democracy. Being a democracy and being a little bit less wealthy, I would take that over being very wealthy and not being a democracy. But maybe other people have a different trade-off.
Bethany: Where do you stand on the concept of middleware relative to other solutions that might rein in the power of big tech?
Luigi: So, the version that emerged in our discussion, which is a version that I would basically call regulation, I’m very sympathetic to. You’re basically saying to these companies: you are not in the business of editing, because you’re monopolies and editing is too dangerous. You are only in the business of putting people in contact, which is very valuable, and you can collect all the information. You can push all the commercial ads you want, but you don’t push the tweets or the posts of x, y, and z. That’s prohibited. But because people might be overwhelmed by the amount of information, they can buy editorial services from independent providers. This requires, first, a massive amount of regulation, and after that the middleware will arise. I’m not so sure that that’s exactly what he has in mind, but I think that particular solution is feasible if this regulation is feasible.
Bethany: It makes me realize that there’s a lot I don’t understand about the world, but one thing I definitely don’t understand is what advertisers are actually paying for and whether it’s even possible to disentangle these two things. In other words, does a company like a Google or a Facebook lose the great majority of its advertising revenue if the algorithms can no longer be used to control what’s put in front of people? In other words, are these two concepts so inextricably linked that you actually couldn’t do the surgery to take them apart without completely taking away these companies’ revenue base? In which case, there’s no way they’re going to agree to it.
Luigi: I’m not so sure I’m much better than you on this front, but my understanding is, of course they’ll lose some, but not 100 percent, because I will still know that you, Bethany, like dogs. I am going to sell you a lot of dog products. That’s very valuable to the advertisers, because they want to know that you love dogs and you’re not a cat person. So, I think that that’s still possible without pushing a lot of posts that even your friends make. The interesting thing is that these digital platforms claim that they do it for you, that you have too many friends or too many people you follow on Twitter. So, they select what they think is better for you.
In doing so, however, they very much play an editorial role. First of all, in my view, they are in violation of the idea behind Section 230 of the Communications Decency Act, which says that they’re not responsible for what users post because they’re not editors, when, in fact, they are editors. Second, once they become monopolies, the editorial role becomes a problem. Editing is a role that every newspaper in history has played, but newspapers tend not to be monopolies. If platforms are monopolies, you want to reduce that role.
Bethany: I think that’s a fair point, but my point was that the line between a dog and an idea, while very obvious in the abstract, may not be so obvious in execution. In other words, why is it OK to put an ad in front of me for a dog product I might like, but not for an idea that I might like? How do you draw the line between those two things? And if you do draw the line between them, how much do you fundamentally undermine the companies’ business model?
Luigi: I think there is a very clear line, which is the difference between an ad and a post. Facebook has been very, very good at confusing people’s minds and blaming everything on Russian interference. But the problem is not Russian interference, it is the mechanism of pushing these posts, and this is the business model of Facebook. So, take away political ads for a second; we’ll come back to them. If you are running an ad about a dog product, or even a new movie, whatever, it’s a paid ad. It shows up as a paid ad on Twitter or on Facebook. You know what it is. Fine. What I don’t want is for Facebook to suggest I buy something, or to show me a post where I don’t know whether it is an ad, or a post of a friend, or a post of a friend that is being used as an ad by Facebook.
Bethany: So, if it were a paid ad for an idea, you would be OK with it. In other words, if what you were being served up by Facebook was a paid ad for, say, some kind of new regulation, or a paid ad for, I don’t know, dismantling big banks, you would be OK as long as it was labeled as paid.
Luigi: Yeah. Exactly the same way in which, in the old days of TV, there were ads. TV was a monopoly. It was an oligopoly, but there was advertising on TV. People knew when it was advertising. The rest of the content was regulated in a different way. I think that I would like to keep the same distinction.
Bethany: OK. I actually think that’s fair. I wanted to argue with you, but I’m going to agree.
Luigi: No, no. The more delicate problem . . . You let me off the hook too easily, because the delicate part is the political ads. So, do we allow political ads? As long as they come in as a political ad and you pay for every click, that’s fine. The problem is that many of these things get repeated and pushed independently, because even the so-called Russian ads, I don’t think they paid to reach so many people. It’s that then they were repeated and pushed through repetition.
Bethany: Right. Well, there also is the whole notion of paid influencers. Where do they fall in this bucket because they are paid? They are paid ads, and yet they don’t always appear as such, because they appear as something you like from the person that you really like. Yet, in a sense, it’s paid. So, that’s why I think it might get a little harder to draw a line between these things in practice than it appears on the surface.
Luigi: Yeah. I think you’re absolutely right, but first of all, it is a bit disturbing that influencers do promote stuff without saying they’re paid for it. But, again, if I choose to follow you, Bethany, I choose to follow you. So, I will never see your Twitter or your Facebook post unless I’m a follower of Bethany. So, in a sense, you choose your poison. You choose an influencer that sells you crap, at some point you stop following them. But I don’t get exposed to your ideas if I don’t choose to.
Bethany: Yeah. I guess that actually is a broader way of summing up Francis’s position, which is that it is pick your poison. You get to have your poison. We’re not going back to a world in which there is a national conversation and people get exposed to different ideas just by virtue of picking up the newspaper. This is almost, in some ways, a cynical solution for a cynical world, in that you’re going to choose your middleware that is the poison that you want. Hopefully, if you want to challenge yourself, you’re going to pick middleware that is challenging. But it also allows you to retreat to the comfort of your own worldview, which, in a way, I get. I’m not sure there is another feasible solution, but in a way it’s also quite depressing that that’s where we are.
Luigi: Actually, at the conference where Fukuyama presented his white paper, I suggested that if we really think just exposing people to different ideas would make a difference, once you have this middleware, there is nothing that prevents that, by regulation, you are exposed 10 percent to the opposite of what you chose. If you are a conservative guy, 10 percent of the posts should be coming from a progressive, and vice versa. I don’t think that this is going to fix the problem, but if that’s what you want, it’s easily achieved through this mechanism.
Bethany: Do you think it is, though? What about all the power and trust that we’re putting in the hands of the regulator? Which we are, in general, in many ways, whether the regulator is the Office of the Comptroller of the Currency or the regulator is the Federal Reserve. We put an enormous amount of our faith in somewhat shadowy regulators or regulators whose inner workings we can’t understand. I suppose this is no different. But, for some reason, the concept of a powerful regulator based in DC determining all of this is frightening to me.
Luigi: Yeah, but that’s one of the things I like about Fukuyama’s proposal, because I don’t think it depends very heavily on the regulator. He’s trying to reempower individuals through the freedom of choice. If you have a system in which, by regulation, the platforms can’t do their own editing, and individuals are left to choose their own editors, I think this is a system that, of course, is a regulation system, because you have a restriction to some kind of organizational form. But, after that, there’s not much discretion left to the regulator. The regulator just needs to enforce this separation. That’s the beauty, to some extent, of structural separations: you don’t have to fine-tune them.
Of course, it’s very easy to prove that they’re economically inefficient, because on the margin, you can do better with a more-clever system. However, a more-clever system is undone the next day by lobbying, and a structural separation has more chances to resist. After all, Glass-Steagall resisted, what, 60 years, and the Volcker Rule not even one.
Bethany: I suppose that’s fair. I’m a little less sanguine than you are on the topic of regulators, because I think they do tend to take on a life of their own. They are Frankenstein-like, in that they are institutions created by human beings that rapidly become something their creators didn’t envision. And often become quite a bit more powerful than their creators ever envisioned they could be, but I agree with you. More broadly speaking, my biggest problem or my biggest question about this proposal is its feasibility.
His point was that breaking up these companies would take too long. But I think getting them to accept anything meaningful with this proposal is also not going to be as simple as the companies saying, “Yay, let’s give up a big chunk of our power and a big chunk of our economics in order to embrace this.” So, I don’t see why it doesn’t require the same process, either of regulation being passed or of the Justice Department winning some kind of antitrust lawsuit that forces them to accept this kind of solution.
Luigi: I think it does require the same process, but drawing on our knowledge of financial regulation, I think that the scary part, in my view, of this proposal is that it can be watered down to be close to nothing. This is dangerous. Why? Because it’s very convenient for any politician to take the name, say that they’ve done something, and move on. I think that that’s where we’re going to go. We’re going to have a lighter, Fukuyama-style middleware that changes nothing. That’s my prediction.
Bethany: That is a cynical, but I think also a pragmatic, take on where it’s likely to end up, because I don’t see the companies themselves ever acceding to anything willingly that requires them to give up a big chunk of their economics, nor do I see their shareholders going along with it. So, I don’t see how anything that is onerous enough to be meaningful can happen without a lengthy process, because shareholders will sue in order to prevent it as well.
Luigi: Of course. But if you start from the principle, they are too powerful and nothing can be done, of course nothing can be done. So, you need to—
Bethany: That’s not what I said.
Luigi: No, I know that’s not what you said, but I think there is a risk of this cosmic pessimism: they are too powerful, and so nothing is going to be done. Of course, it’s self-fulfilling. We need to try to do something. I’m with you, that I fear the life of a Frankenstein regulator. That’s the reason why I really became a big fan of sharp, clear, simple rules, or no rules. Because soft rules make the fame of economists and the money of lawyers, but they don’t really improve things.
Bethany: I agree. Taking the conversation in the direction of “these companies are so powerful, this isn’t going to work” really does defeat the purpose, because the point is, and, in my view, the greatest contribution of Fukuyama’s paper is the debate, the ideas, that we’re all talking about this now. If you think about where we were five years ago or 10 years ago . . . Stop. Somebody had the temerity to walk by with a poodle.
If you think about where we were five years ago or even 10 years ago, this conversation was nowhere. It will be interesting to see what happens going forward, because certainly over the last decade, one of the moves in the American system has been that big tech has become increasingly allied with Democratic lawmakers.
Based on the actions that the platforms take, it seems that they are a weapon that the Democratic Party thinks it can control. In other words, the Democratic Party thinks they have the gun, and they can point it at their opponents. I worry that Francis’s very wise point, that a gun is a weapon that can all of a sudden be pointed in the other direction, is something they’re not going to understand, because when you think you control the weapon, it’s very tempting to think that you’ll always continue to control it.
Luigi: But actually, talking to him, it occurred to me that, ironically, there is a link with a discussion we had about the Friedman principle and basically corporate governance in general. Because the real reason why the Murdoch hypothesis is unlikely to be true is that if you have a Murdoch taking over Google in this moment, I think that Google will lose a tremendous amount of value, because a lot of employees will leave. Even Murdoch is not in the business of losing billions. So, if he has to acquire Google at a very high multiple and then faces a huge capital loss because all the best employees leave, it’s not going to work very well.
The point that most people don’t fully appreciate, that at the end of the day shareholders don’t have so much power, and that employees and customers might have more, is manifested in places like Twitter and Google. If they were to dramatically change their political position, I think they would lose most of their value, because the employees would leave.
Bethany: That’s interesting, but I’m not sure it’s optimistic, in the sense that I would hope the Democratic Party would understand that this weapon is not theirs to control. The argument you just voiced cuts against that: perhaps they will believe it is their weapon to control for precisely that reason. I think that would be to the detriment of all of us.
Luigi: Yeah. I think that my argument suggests that it is more likely that the Republicans will regulate Google than the Democrats, but we’ll see.