Henry Farrell is one of my favorite international relations experts, co-author of a fascinating book about “weaponized interdependence” that was the basis for a great event last year at CUNY’s Graduate Center. It’s even more relevant now, with America under new management. And Henry has also been writing some really interesting stuff about the emerging ideologies of Silicon Valley’s elite. So we had a wide-ranging conversation, with probably too many science-fiction references. Transcript follows.
TRANSCRIPT:
Paul Krugman in Conversation with Henry Farrell
(recorded 7/22/25)
Paul Krugman: So, hi everyone. Paul Krugman again. I'm talking this week with Henry Farrell, who's a political scientist working on international relations (I'm not quite sure what the exact academic description is), somebody I've known for quite a while, and who wrote a fascinating book with Abe Newman on weaponized interdependence, some of whose topics we'll touch on. He's also been writing some very interesting stuff about the politics of the tech elite, which is what made me think I'd want to talk to him now. And we're just going to launch in and see where it goes. So, hi Henry.
Henry Farrell: Hey Paul. Great to be on this.
Krugman: Yeah. So before we get to Silicon Valley and all of that, I'd like to go back a bit. We had a CUNY event with you and Abe Newman, which was extremely successful, on weaponized interdependence. I know you have some new stuff coming out, which we won't try to get into here, but the book is called Underground Empire, and it's about the influence that the United States has, and sometimes uses, basically to cut countries off from networks, and about how there's a lot more power in the hands of the United States than people realize. I'd like to get a bit of an update. So first of all, are there things you would say differently now?
Farrell: So in the final chapter of Underground Empire, Abe and I looked toward a future in which, on the one hand, the United States could continue to double down on some of the ways in which it really has been weaponizing the fundamental infrastructures that the world economy runs on.
Or, on the other hand, it could try to figure out how to do this in a more moderate and more peaceable fashion, looking to the interests of other actors. I don't think we ever cited him, but one of the people we were thinking about as we were writing this was Charlie Kindleberger. Kindleberger was an economist (as you say, Abe and I are both political scientists), and he has this fundamental notion of what U.S. hegemony ought to be: he suggests that what the United States really does is provide a system and an order for the world economy. He's thinking here about the monetary system, trade, all of these other things. And he says that the United States can act as a benevolent hegemon in order to try to keep the world system together.
I think when Abe and I wrote the book, we still had the hope that that could happen. And I really think that neither Abe nor I is particularly optimistic about that right now. We can see ways in which you can try to put bits and pieces of Humpty Dumpty back together again after he's been shoved forcibly from the wall. But I think we are never going to get back to anything that even remotely resembles the high era of globalization of the sort that we saw even a decade ago.
Krugman: Yeah. By the way, I studied with Charlie Kindleberger, one of the most quotable international economists ever. And we might come back to him a bit. But you were worried that the United States might not be willing to play this kind of benevolent role; you talked about what looked to you like abuse of the U.S.'s strategic influence. And it looks to me like we're seeing a completely uninhibited version of that right now.
Farrell: So, yeah. Abe and I have a piece which came out last week on Wired's website (I think it's in the next print issue) where we talk about this. The article is called "The Enshittification of American Power," building on an idea from Cory Doctorow, a science fiction writer who we both know quite well, and we argue that effectively the same thing is happening with United States control of all of these infrastructures and platforms.
So if you wanted to be a little bit more academic about it, you could call this hegemonic enshittification. In other words, taking on this role that Charlie Kindleberger talked about and turning it into something else. Instead of trying to maintain the global system in ways that everybody benefits from (you perhaps benefit more than the others, but that is kind of the price that you demand), you start to enshittify the entire system, grabbing everything you can for short-term profit. Another way you might think about this: I'm a great fan of the TV series The Sopranos, and there's a wonderful episode, I think in season two, where Tony Soprano takes over the store of a friend of his who has gotten into gambling difficulties, and they more or less go for broke, ordering in everything that they possibly can, maxing out the credit, in the expectation that they'll basically be gone before the problems hit. And I think that we are in a situation which is kind of like that. It is a bust out of American power and American hegemony that is happening right around us.
Krugman: Okay. A mafia bust out of American power. That's a new one. I mean, that's a little bit different from enshittification, or at least it's a different phrase. By the way, I'm actually fascinated by the whole enshittification thing and have been trying to find academic stuff on it, and it turns out that there really isn't as much as you'd like. But in the business strategy literature, it's actually called penetration pricing. And it's praised: you know, charge low prices, build up your customer base, then squeeze them.
Farrell: There's another economist, Albert Hirschman, and his book Exit, Voice and Loyalty. There's a discussion there of how economists, by and large, don't think about product quality.
And so one of the arguments he makes is that one of the great problems of monopolies is that they become lazy rather than trying to maximize prices. And this is, I think, another way you can think about enshittification. You can think about active enshittification as trying to squeeze the most that you possibly can out of both sides of a two-sided market. Or you can think about it just in terms of the way that Ma Bell used to be back in the 1960s and 1970s: they just shrugged, rested on their laurels, and watched the money come in.
Krugman: Okay. Yeah, definitely. I could go on, but it's too much; I'll probably be writing some stuff about this once I figure out what I actually think. Let me just follow on this a bit, because I really want to get onto these other things, but I think the power and the limits of the United States are really interesting. So, two of the things that you talked about a lot in Underground Empire were basically U.S. control of the world's payment system, and then technology, control of a lot of key technologies. And I'm wondering whether you're rethinking a bit whether the United States has quite as much power there as the book seemed to say.
Farrell: It's really interesting, and I think we're in a big moment of crisis and possible shift right now. And here let me just say, I'm not a global financial economist; I'm not a Barry Eichengreen or a Pierre Gourinchas. There is a lot about the relationship, for example, between reserves and the kinds of uses of the dollar that Abe and I pay attention to (we think of the dollar more as a transactional platform, in a certain sense) that I certainly don't know, and I think we don't necessarily have a complete handle on the politics of it. So there's some real politics there that we're still trying to figure out.
But I would say that we are seeing challenges to United States power and domination, which may be coming from a couple of different places. First of all, think about the U.S. dollar clearing system, which is what we argue is super important. The dollar clearing system is more or less a system in which international transactions, international trade, can be consummated using the U.S. dollar. So say, for example, you are an exporter in Japan who wants to sell something to, I don't know, I'd say Iceland, but I can't remember what the Icelandic currency is.
Krugman: It's the króna.
Farrell: Okay. Let's say it's the króna. There is going to be an extremely illiquid market between Japanese yen and Icelandic krónur. There are not going to be all that many people who want to trade one for the other.
So more or less what you do is either you have the transaction denominated entirely in U.S. dollars, or else you dip in very briefly: you turn your yen into dollars and then you turn your dollars into krónur. And this provides the United States with an extraordinary amount of power, because in order to do that you have to dip briefly into the dollar clearing system, which is effectively run through a series of United States regulated institutions. And the United States began to figure out, post-September 11th, how to turn that into a means of really getting everybody to adhere to United States sanctions, to implement sanctions, because every international bank wants and needs access to the U.S. dollar.
But suppose we're in a world of CBDCs, of central bank digital currencies (this is something that you touched upon in your newsletter this morning), and we also see a variety of private-sector efforts to clear transactions. Then maybe the role of the U.S. dollar becomes a little less important than it has been in the past, and maybe other actors begin to ask whether there are alternatives that they can create. And when it comes to technology, for example to semiconductor supply chains, I think that there is a real trade-off, which I think we did talk about a little bit.
Other people, like Ali Wyne, have written about it in Foreign Affairs: if you're trying to use this chokehold, as the United States has done against China, you effectively deny China access to high-end semiconductors in order to push back on its efforts to develop top-quality AI. You can perhaps do that for a while, but you obviously incentivize China to figure out, on the one hand, whether it can build its own alternative structures for making semiconductors, and on the other hand, whether there are threats it can deploy against the United States in return. And this is what we've seen happening over the last few weeks, where the Trump administration has effectively made some substantial concessions to China in return for China not using its rare earth stranglehold against U.S. car manufacturers, plane manufacturers, and others who need access to Chinese minerals.
Krugman: Okay, so let me try to do this quickly, just on the dollar system and all of that. In the early days of the Russia-Ukraine war, there was a lot of talk that Russia would be cut out of SWIFT, the messaging system that underpins international payments, which they were, I think, and that this would be catastrophic. And they found workarounds. I wonder if we should rethink this; obviously, people in that business think of the way they do things as being absolutely crucial. But are there slightly more costly but still workable alternatives that diminish that impact?
Farrell: So there are, and I think you see the same thing happening with Iran, which is something else we talk about in Underground Empire. Iran is effectively cut out of the global financial system. The United States had allowed Iranian transactions to briefly touch the U.S. dollar; then it gets rid of that, so that international banks are effectively banned from having anything to do with Iran. And Iran figures out a series of extremely messy, somewhat complex, somewhat painful workarounds. After a while it has built its own kind of shadow system, which is messy, which is costly, which is a real pain to actually engage with, but which is nonetheless not the death sentence for the Iranian economy that people thought it would be.
And you see similar things happening with respect to Russia. At the beginning you see a lot of people talking about how, as you say, this is going to be a death blow, and this is the combination of SWIFT and some measures against particular Russian banks as well. By the end of the Biden administration, you're seeing people like Janet Yellen saying, well, really what we're trying to do is not deal death blows.
We're trying to impose friction on the system. So none of these systems are actually perfect; none of these forms of weaponization are perfect. And the more that you weaponize these systems (this is something that Abe and I said in the original 2019 article), the more incentives you give to others to figure out workarounds. What may be extremely powerful and devastating in the short term, and even the medium term, is probably not going to work out in the longer term. You're probably going to end up in a world where you have effectively defanged some of your most serious economic weapons, and where others don't want to have anything to do with your systems any more than they absolutely have to.
Krugman: Yeah. Okay. And then there's the China thing. The U.S. tried, for a while quite successfully, to deny China access to the most cutting-edge technology. And there's a fascinating thing in Underground Empire about sort of kneecapping Huawei, which worked for a while. But it looks as if, I mean, I'm not an expert here, but it looks as if the Chinese are kind of working their way past that on the technology front, and then they're weaponizing interdependence right back at us with rare earths.
Farrell: That's exactly right. Now you see different people saying different things, and certainly some people from the Biden administration claim that this was partly successful. They also had, I think, some very strong ambitions: they effectively believed that we were going to see artificial general intelligence emerge in the very near future.
So from their perspective, if you could keep the chip choke on China for long enough, the United States could develop AGI before China developed its own version with Chinese characteristics. And after that nobody would care, because AGI would become this kind of magical, self-licking technological ice cream cone.
Krugman: Not sure I like that metaphor but okay.
Farrell: I don't want to think about what it would be like in fact. I suspect those semiconductors will be pretty sharp on the tongue. But nonetheless their success condition was one where they could keep ahead for long enough that the United States could use AGI to build an insuperable and self-reinforcing technological advantage.
Equally, it's very clear that China is beginning to figure out ways around this, in particular when it comes to training AI. You see DeepSeek having built its famous R1 model, a reasoning model which seems to be not quite at the cutting edge of what the United States is capable of doing, but pretty close.
And other Chinese companies have come up with similar types of models. So this effort to create a long-term throttle on the ability of the Chinese economy to achieve certain high-tech capabilities doesn't seem to work. And equally, China is also building up its own stack, its own set of technologies around energy and other areas where I think the United States is really falling further and further behind.
Krugman: Okay. I have to say that I hadn't realized that AGI was such a big part of the thinking, that the idea was basically that we would get an artificial supermind first. Although why anybody thinks that such a mind would actually take orders from the president of the United States, as opposed to just killing us all, I don't know. I mean, this is really interesting, and enshittification is kind of a nice frame, because the U.S. certainly had a lot of special position, and in the course of exploiting it may also be throwing it away, which is kind of one story about what the big tech monopolies are doing. Any further comments there? Because I want to move on to a little bit about the tech guys.
Farrell: Okay. So I would say that the thing about enshittification, and maybe this is a way of moving onto the tech guys, is that in our initial thoughts about weaponized interdependence, Abe and I really emphasized the role of government, and we underemphasized the ways in which private-sector companies had their own degree of autonomy and agency. We kind of assumed that if a government said jump, the private-sector actors would ask how high, as in the famous phrase.
And so I think that one of the things we're seeing with enshittification is a kind of weird mind meld of the Trump administration with a fair chunk of the high-tech sector. It used to be that you could think about the problem of weaponized interdependence in terms of: is the United States overusing its power? Is it paying attention to its long-term interests in the right ways? Does it have the right understanding of what kinds of trade-offs it faces?
And now I think we're in a world where it becomes increasingly difficult to tell. Say, for example, when Trump goes to visit the Gulf states like a mother duck, with all of these tech CEOs trailing along behind him like little baby ducklings, it's really hard to tell where the business interests of powerful and connected companies end and where the national interest of the United States begins. The two are really beginning to blend into each other, and that's something that neither economists nor political scientists, nor really anybody I know, has really begun to get a handle on. If we're in a world where the platforms on which our economics runs, on which we talk about things, are increasingly merging with the U.S. government, what kinds of consequences does that have for the U.S.? What kinds of consequences does that have for other countries?
Krugman: So let's talk a little bit about that. There were a lot of things that I think I did see coming, at least about what might happen in the U.S., but this hard right turn, or whatever it is, by a lot of big players in Silicon Valley was not part of what I had in mind. We knew that Silicon Valley was socially liberal. There are a lot of complaints that the Democrats have become the party of the educated, and so that in some ways created an affinity.
But you have all of these guys, Peter Thiel, Marc Andreessen, Elon Musk. Why don't you tell me your story, and then I'll tell you mine. I've been having arguments with some of my friends; I'm probably closer to where you are, but I'm not sure. Anyway, I'll try to make the other case. What's your story about what happened here?
Farrell: Okay, so my story is that there has been, for a long time, a pretty strong right-wing faction within Silicon Valley. Becca Lewis, who is a scholar, has a great PhD dissertation on this, which I think she is working up into a book. It has always been part of the mix in Silicon Valley. And if you go back and read, for example, people like Margaret O'Mara, there has also been this connection between Silicon Valley and the Pentagon, which disappeared for a decade or two but which was extremely important back in the early days.
So the politics of Silicon Valley have always been more complicated than I think we have given them credit for being. Equally, as you say, there was this very, very clear ideological Silicon Valley mindset, if you want to call it that, in which people tend to be very socially liberal, quite conservative when it comes to things like unions, but also willing to allow for, perhaps, universal basic income or significant transfers of money, et cetera. Neil Malhotra and David Broockman have what I think is the only good data looking at Silicon Valley venture capital elites, founders, and other people like that, and at what their values are. And they more or less are the values that you would have expected.
But you did have Peter Thiel, who has been very, very strongly to the right and strongly anti-democratic for quite a while, and a few other people around that position. And that position becomes increasingly prominent beginning around 2016, 2017, when a lot of the bloom comes off the Silicon Valley rose, because people are suggesting that Silicon Valley was in some way involved in Donald Trump's rise. And then during the period of 2020, 2021, there is this upsurge of right libertarianism in which a lot of people get sucked in. I think Marc Andreessen is a perfect example of this. He was somebody I used to have a slight online acquaintance with, you know, the way that you do, where you follow each other on Twitter and you occasionally chat. And he always struck me as somebody whose politics were not mine, but he was not a stupid person. And you can watch, I think literally, his brain get cooked in the 2020-2021 period as he begins to read all of these 1950s and 1960s thinkers about the managerial state, about the role of communism, and so on. You begin to see this shift happening.
One of the things I really wonder about: I kind of feel like we are cosmologists who are trying to infer the patterns of large-scale galaxies on the basis of all of this dark matter that we cannot directly observe. You are beginning to see indications that there were a lot of group chats happening in which I think people's brains got collectively cooked in ways that they have not particularly recovered from. But in addition to this, we have multiple different stories, most of which probably have a little bit of truth, some of which are distorted.
A third thing that I think is important here is that Silicon Valley had a story that it told about itself until 2015, 2016, which is about how, if we connect the world together, we're going to create a world of liberal awesomeness in which Chinese communism will eventually fall and dictators will crumble.
We will have Twitter revolutions left, right, and center. And then it begins to become clear that not only is that not true, but that social media may be indirectly connected to some of the weakening of democracy in the core states. That mythology falls apart. The political coalitions that were based around that mythology begin to fall apart.
And I think people begin to look for different ways to justify themselves and the world. And right-wing ideologies are one of the things that they hook onto. That's not very organized, but I think we're in the world of speculation, and so that's a bunch of the different plausible speculations that I've seen floating around based on what we can actually figure out or infer.
Krugman: So clearly there was a really hard right turn leading up to the return of Donald Trump. And one side of the argument is that what really drove them up the wall was that the Trump, sorry, the Biden administration was actually trying to regulate, was actually trying to limit what they could do, so it was a concrete thing. And the other side of it is that they're lashing out over a feeling, sort of the Rodney Dangerfield theory of Silicon Valley: they can't get no respect. These are people who were regarded practically as demigods circa 2015. I don't know how much weight to give each side. I tend to be more on the subjective, emotional one, but people say to me, you're being silly, come on, this is really about not wanting to be regulated. Do you have a take on that?
Farrell: It is very clear that regulation has always been an important part of the thing. Clearly there is a profound hatred of antitrust. So the fact that Lina Khan was in charge of the Federal Trade Commission, and was effectively the major actor in charge of antitrust policy, is something that made people extremely unhappy, including people who are part of the traditional Democratic coalition.
If you look at, for example, Reid Hoffman, who I would think of as a moderate left Democrat, he made it clear during the Harris campaign that he did not want to see this kind of antitrust push continue. So I think there very clearly was a stronger push toward regulation and also a stronger push on antitrust. And Silicon Valley is all about building up monopolies. You know, if you read Peter Thiel and Blake Masters' Zero to One, it is absolutely blunt in saying that economic competition is for losers; I think they more or less use those exact words.
What you want to do is build up a moat, build up an impregnable position where you can begin to reap all of those delicious monopoly profits. If you look at Marc Andreessen and his partner Ben Horowitz, at some of the podcasts they made, they get crabby about the way the Biden administration treated them: the people they were allowed to talk to, they felt, were not sufficiently senior or serious to match their importance in the world.
There's also Andreessen's interview with Ross Douthat, where he makes this clear, and this gets back to the union thing. One of the things about the way Silicon Valley thinks about founders, about the power and the importance of the founder, is that you ought to be able to make the rules within your own business, that you ought to be able to decide. You are the boss. You are the CEO. You are the person who has the vision, and other people within your company are more or less required to run around and make that vision happen. And when you get all of these cranky and crabby engineers, some of whom have left-wing ideas, some of whom want to protest against this or that engagement with the military, when these people are trying to tell you how to run your business from the inside, that makes you extremely unhappy.
And some of the stuff Marc Andreessen says, where he more or less suggests that universities have been inculcating communist propaganda into young people, is quite explicitly about those kids who are on my lawn, who are in the offices of the companies that I'm investing in. They are basically acting as if they can tell the founders what to do, rather than just buying into the grand, glorious vision. And this is evil. This is wicked. This is a fundamental transgression of the natural order that ought to be.
Krugman: You actually mentioned, along the way, Neal Stephenson's novel Snow Crash, which is a really good novel but portrays a world where governments basically collapse. I thought it was a dystopia, but Silicon Valley types seem to think of it as a kind of operating manual or a manifesto.
Farrell: So yeah, I think they do. And there's an interesting throughline there. Neal Stephenson writes this book Snow Crash. It's a very fun novel; its political values are not my values in a bunch of ways. But it gets taken up by people like William Rees-Mogg, the father of Jacob Rees-Mogg, the well-known United Kingdom politician. He writes this book called The Sovereign Individual, which is really a transcription of the Snow Crash argument with bits and pieces of economics and of sociology from people like Charles Tilly, more or less saying: this is the world we're about to head into, you need to be on the ground floor if you want to prosper in it, and we are the people who will tell you how to thrive, if you only subscribe to our extremely expensive business newsletter. This then gets taken up by Peter Thiel. The Sovereign Individual was reissued a couple of years ago with a preface from him, more or less saying that this basically tells us how it is.
And then you get people like Balaji Srinivasan, who was part of the Andreessen Horowitz mind meld for a while, arguing for what he calls the network state. Which is, again: we are going to see the world collapse, we're going to see cryptocurrency emerge and create this alternative order in which we have all of these micro network states where individual founders can basically decree according to their values, and in which we will be in for a world of social collapse. And oh my God, is it not going to be awesome?
So it really is a world in which people take these sometimes fun, sometimes weird or problematic science fiction ideas from the 1990s and try to deliver on them. The other anecdote that Abe and I have in Underground Empire about this is that if you read Peter Thiel, he says that the whole idea behind PayPal was inspired by another Neal Stephenson novel, Cryptonomicon, which is all about the collapse of the U.S. dollar, and cryptography plus a convenient hoard of gold allowing private individuals to take over. Thiel says in a couple of places that this was the idea behind PayPal, this was what was inspiring us: we wanted to be the replacement for the U.S. dollar, even if we didn't understand jack shit about how international monetary finance actually worked. And so science fiction becomes, I guess, the unacknowledged legislator of mankind.
Krugman: Good God. We should rewrite the end of Keynes, whose General Theory ends by saying that the ideas of economists and political philosophers rule the world. But you're saying it's actually science fiction writers?
Farrell: Yes. Science fiction writers are the defunct scribblers who are actually shaping the world around us.
Krugman: Yes. Just to continue with that a little bit: it actually seems to me that there's kind of a link between the enshittification story and all of this, which is that there's a kind of lifecycle to these networks, these platforms and so on. There's an initial phase in which they offer great stuff, cheap, to build market share, and everybody says, well, that's great, and why would antitrust be at all concerned? But then there comes a point when they start to harvest it, and all of a sudden someone like Lina Khan says, well, this looks like monopoly to me. And they say, no, we're special.
Farrell: Yeah, I think it's that. And I think there's another side of it, something like what Max Weber the sociologist called rationalization. The backlash, I think, is very definitely driven by enshittification and by people seeing the problematic ways in which this can happen. But a lot of these people, and I think you see this with Elon Musk, you see it with others, are inspired by those visions of themselves as grand, heroic remakers of the world. And they want to keep on going from glory to glory.
So this is kind of like the first of the Dune books, where Paul Atreides smashes in through the shield wall on his sandworms and everything is going to be awesome and the Fremen are going to take over. And then the second and the third books (I'm not even going to get into the fourth and the fifth, because they're terrible) are all about the aftermath of that, and how you realize that once you've won a galactic empire, you have to deal with all of these problems, which is a pain in the ass.
And so if you look at the ways in which people like Andreessen justify themselves, if you read the Techno-Optimist Manifesto, it is this kind of wave of progress that we are going to surf into an absolutely awesome future universe, in which we're going to have the kind of AI minds that other science fiction writers have written about, which will guide us to conquer the near parts of the galaxy.
And so Sam Altman, Dario Amodei, Elon Musk, they all buy into this vision. I think a lot of this is driven as much by boredom and by the desire to see yourself as an avatar of progress, one of the people remaking the world, as it is by the more humdrum desire to accumulate power and influence, a lot of which is probably pretty boring if you basically saw yourself as a world-bestriding colossus and then you have to deal with the laundry.
Krugman: Yeah. It is this mix. I mean, there are real concerns, I'm not sure exactly about money, but about control, about being able to dictate, to not have meddlesome bureaucrats or unions tell you what you can and can't do. But there's also the ego thing, which is the pedestal on which the tech leaders were put a decade or so ago, or a bit more. They're clearly not on that pedestal now; they're measurably not. We can look at the surveys and see that they don't have that kind of adulation anymore. I think that's a powerful driving force, that there's a kind of bruised ego issue, this feeling of: I'm supposed to be a god-king, and what happened?
Farrell: Yeah, no, I think there is a lot of that too. And I think one of the professional defects of social scientists like yourself and myself is that we tend to be much more comfortable trying to figure out what happens in the aggregate, with masses, than with the idiosyncrasies of particular elites. The great man theory has a bad name in the academy, and for some quite deserved reasons.
And so this means, I think, that we are probably less equipped to understand a world where, as you described it in your newsletter yesterday, we have a relatively small number of people who have absolutely outsized power, who have accumulated not just economic wealth but the power to control platforms, to shape a lot of the actual day-to-day existence of our lives.
And if these people talk together, if they imbibe these very, very strange beliefs, sometimes from each other, we don't have the kinds of tools or techniques for thinking about how that actually works. If we're in a world where it is basically 30 or 50 people in the same group chats who are responsible for a substantial amount of what happens in our economy, a lot of the ideas we have, the ideas about competition, about mass opinion, don't disappear, but we have to begin to reconfigure them into a different system that has a different logic to it.
Krugman: Yeah. It's funny, economics as a discipline leaves me very ill-equipped to have useful thoughts on this. But economics as a profession is a field that is very hierarchical, in a weird way: not in the sense of literally having power over subordinates, but in that there's a ranking. Even before we had algorithms, everybody knew the ranking, and the ability of people to feel bitter because they were generally regarded as the fourth-best international economist in the world and not the first, the role of bruised ego in people's behavior, has always been really obvious to me.
Farrell: I just want to make a plug here for one of my co-authors, Marion Fourcade, who is an economic sociologist. She has a great article in the Journal of Economic Perspectives with two co-authors, "The Superiority of Economists," which is exactly about the way economics is organized, in ways that really reflect, as you say, rankings and implicit power structures. And this has consequences.
Krugman: Yeah. It's so funny. At least in the upper reaches, you do not do office politics; there are hardly ever any fights within departments. Maybe there's a fight about who will be chairman, because nobody wants to be department chair, but everything is based on outside constituencies, on your reputation, and it's very hierarchical. The other thing I would say, and I'm talking more than I usually do in these interviews, is that I also find myself in meetings that include elite people at the top of different ranking structures, and everybody wants to be what they aren't.
In particular, reading you about Silicon Valley and all this stuff: I don't actually encounter tech bros, but bankers who really want to be taken seriously as intellectuals are something I really do see. And so the group chats, which you've been following more than I have. It turns out that there's a lot going on in these sort of virtual back rooms.
Farrell: Yeah. People talk to each other, and at the end of the day we are social animals. I think, and at least this is the loose consensus among people who study this from the point of view of social psychology and political science, that it is less easy than people think to brainwash others, to get them to believe things which are clearly wrong. But it is very often quite easy to get them to suppress their own opinions because they think that there is a majority on the other side of the question.
So this is something that I think is quite important about social media. And if you're in this world where you are a superstar, you regard yourself as a superstar, and you have people who adulate you, you are also in a world where you know there are other people who sit in different hierarchies: people who are politically powerful, others who you are rubbing shoulders with, some of whom take you seriously and some of whom don't. You are going to be exquisitely attentive to the cues about what you need to do to succeed and not to succeed, about who is validating you and who is not. And so I think you begin to see peeves being magnified into actual world-historical events, again in ways which, as social scientists, we tend to assume come out in the wash.
If we're in a world where Elon Musk and Donald Trump are feuding over something with Stephen Miller in the middle, I wonder whether it may be more useful for us to look at the literature on authoritarian regimes, and in particular some of the weirder authoritarian regimes, to really begin to understand the way in which personal slights, factionalism, and complex things that happen behind closed doors can have consequences. Brad DeLong used to call this the Topkapi Palace phenomenon, where you try to figure out what was happening from which corpses floated by you in the river in the morning. And I think we're a little bit back in that world.
Krugman: Good God. Yeah, and it is true, the fact that it's a few dozen billionaires who have so much influence makes this kind of thing, which we would normally not take seriously, suddenly matter. We don't want to go too much longer. So you have all of this stuff, this hard right turn of the tech community, but these people are not stupid, and they have to be aware that in an important sense everything that has made them what they are rests on a basis of science and information. Silicon Valley is built around Stanford, and now we have this incredible assault on science. Are we seeing any signs that they're starting to say, hey, wait a second, stop that?
Farrell: Not nearly as much as you would think. There are some people. It's been interesting (I've not looked in the last couple of weeks), but seeing how Paul Graham, for example, who is not rich in the same way that some other people are rich but is an extremely powerful figure within the Silicon Valley community, began to disassociate himself from DOGE and also from Palantir in some ways suggests that some people within the Silicon Valley elite consensus are quite unhappy. But I think we're not seeing nearly as much pushback as we might expect. And I think that some of this has to do with the ideas about AI.
So if you look at the people who are writing from Anthropic: Dario Amodei came out with this piece, "Machines of Loving Grace," which, again, if we want to talk about science fiction, basically says that Iain M. Banks's Culture novels are the future we're headed into, and we saw something very similar from Demis Hassabis in an interview in Wired. More or less what they're suggesting is that in the next few years we're going to have what Amodei calls a country of geniuses in a data center, that we won't need traditional research facilities anymore because we will have AI more powerful than human intelligence figuring out all of this stuff for us. And so I think there are a lot of people who are smart enough to realize that this is, shall we say, a decidedly risky bet for the future of the United States and the rest of the world. But the prevailing consensus that the apocalypse, or the revelation, or whatever, is right around the corner makes it harder for them to mobilize a counter-action than you might have expected even a few years ago.
Krugman: Okay, a word for listeners: Iain M. Banks, the late Iain Banks, was a science fiction writer with an incredible series of novels, the Culture novels, which do portray a kind of utopian society based partially on artificial intelligence. What's funny about that is that Banks was pretty clearly a socialist himself.
Farrell: Absolutely.
Krugman: Absolutely. And he would loathe what's being done, but it is amazing to see. Anyway, Use of Weapons would be my recommended starter book there. Just one last thing, something I learned about and wrote about on the Substack: there's kind of a war between the crypto bros and the AI bros. Have you been getting that? Because in a lot of what you're saying they were merged, up to a point, but it feels like there is now actually some break between them.
Farrell: That's something that I have not been following nearly as much, so I don't have much to say about it, but I do think this connects back in some important ways to the conversation we were having at the beginning about the power of the United States to control platforms. So look at the GENIUS Act, which both you and I have written about (me with Dan Davies); I think we both argue, in different ways, that this is a disaster in the making. This is an act intended to enable stablecoins, a particular kind of crypto approach to providing stable cryptocurrencies, which are usually (though not uniquely) linked to the U.S. dollar, and to provide them a space in the financial infrastructure. It is justified by its backers as a means of allowing the United States to push back against other countries which are trying to introduce central bank digital currencies that are seen as potentially damaging to U.S. power.
But if you look at the debate in Europe, and Dan Davies and I have been looking at this quite closely, you see that in fact it's more the other way around. The European Union is desperately trying to hurry up its own digital euro project, because it sees a world in which stablecoins come into being, on the one hand injecting a lot of the crazy of crypto into the traditional financial system, and on the other hand, to the extent that they do become used, providing the United States with another means to influence payment systems and hold the European Union to ransom. And in a world where Donald Trump is the president of the United States, Europeans absolutely do not want that to happen. So crypto, I think, is accelerating some of the contradictions in ways that really could have consequences for the willingness of other countries to entertain the U.S. dollar as a currency of exchange.
Krugman: Wow. Yeah, until I saw it from you, I wasn't aware that the Europeans were actually, seriously, possibly going forward with a central bank digital euro. That would be a formidable thing out there; that could be a really big deal. Okay, is there a way out? I find it kind of terrifying that so much of our future is in the hands of these people who live in bubbles, who are surrounded for the most part by people who tell them how brilliant they are, but who are resentful because not everybody agrees. Do you see an end to this?
Farrell: Well, one thing that I think is useful to remember is that when we talk about Silicon Valley in generic terms, it can mean a lot of different things. On the one hand, it can mean the relatively small group of people we've been focusing on. On the other hand, if you look at most people in Silicon Valley, at the majority of people who are engaged in the tech industry, their values have probably not shifted that much. So I think there is the possibility that we'll begin to see different technologies and different founders emerge, especially if a number of the bets that are being made don't work out (and people who are not particularly rational in other ways are not necessarily rational at making good bets). I think there is the possibility of a different Silicon Valley.
My first real experience of Silicon Valley was at a thing called Foo Camp, which is run by Tim O'Reilly, a well-known figure in Silicon Valley thought. The late Aaron Swartz had decided that the technologists at Foo Camp, which is an annual conference, didn't have enough exposure to the social sciences, so he had gotten me and somebody else invited. And for me it was an incredibly refreshing, indeed I'd say intellectually exhilarating, experience, because I'm an academic and I'm used to people whose initial instinct, when they see a problem, is: let's study the hell out of this. When you go to a place like Foo Camp, it's engineers, and their basic attitude is: let's figure out ways to solve this.
So I think there are other elements. There's a different Silicon Valley that is possible, that is there in embryo, one more committed to actually helping government work, in ways that I think DOGE has overshadowed. There are a lot of ways in which you could see a different approach to technology having a huge amount of value. But figuring out how to bring that to the fore, and how to make sure it does not become corrupted in some of the ways that existing platforms and technologies have become corrupted, is a huge challenge for political economy.
Krugman: Okay, well, let's hope for a better Silicon Valley and a better future.
Farrell: Thank you so much, Paul. This was a ton of fun.