Trussing the Big Beautiful U.S. Dollar
By Harold James
This paper is part of an initiative from the Peterson Foundation to help illuminate and understand key fiscal and economic questions facing America. See more papers in the Expert Views: Lessons from History for America Today series.
We are in the middle of a very dangerous experiment with the U.S. dollar, and with the international monetary system, whose fundamental driver is a fiscal gamble, perhaps brave, perhaps foolhardy. A long-predicted weakening of the U.S. currency, the consequence of long-standing current account and fiscal positions, now threatens the leadership and the political security of the United States. Worse still, it provokes the administration to undertake even riskier maneuvers — bets on the future — to save the dollar by creating more demand for dollar assets. These tricks just may work, but the history of gambles for resurrection, of Hail Mary passes, is hardly encouraging.
The U.S. dollar has been at the center of the international payments and monetary system since the mid-twentieth century. In the early years after the Second World War, the United States was the dominant manufacturing producer — practically the only maker of machine tools, for instance, since the other major sources, Germany and Japan, were devastated. Everyone needed dollars to buy goods for postwar reconstruction, and the consequent dollar shortage constituted the major complaint about the international system. By the late 1950s, U.S. overseas investment, as well as military spending, had produced such an outflow of dollars that a new worry arose: that the United States was pumping too many dollars into the system, exporting too much capital in a bid to dominate the world. American outflows initially appeared analogous to the nineteenth-century experience of Britain, where capital exports grew and grew.
The United States did not follow the nineteenth-century British model, however, because from the 1970s the overall balance of capital flows was negative rather than positive. Even earlier, the United States had attracted large short-term inflows. There are different ways of thinking about the development: for some analysts, the United States was like a bank performing a maturity transformation, making long-term loans on the basis of short-term deposits. Other commentators focused simply on the net development: current account surpluses disappearing and deficits appearing in the 1970s. Americans bought more and more goods from the rest of the world, so that the widening trade deficits required ever larger net capital inflows to balance them. In this phase, dating from the initial deterioration of the U.S. current account in the late 1950s, one interpretation of the problem captured the attention of policymakers: the diagnosis provided by the Belgian-born Yale economist Robert Triffin. He concluded that the world faced a dilemma. Either the United States would continue to supply dollar credits, and the result would be a buildup of claims against the U.S. that the rest of the world would at some point realize were no longer backed by real assets (monetary gold), prompting a run on those dollar claims. Or the United States would not supply enough assets, in which case the deflationary dangers of the interwar years, and of the immediate postwar period, would reappear. The picture appeared to replicate the position of the interwar era, when Britain was still the provider of the world’s major reserve asset: Britain was exerting a deflationary pressure on the world, and at the same time there were substantial claims on London. The crunch came in the late summer of 1931 and was resolved by Britain leaving the gold standard. An analysis equivalent to Triffin’s argument of the late 1950s had in fact been made by the great Polish economist Feliks Młynarski in a 1929 book, Gold and Central Banks.
One consequence of Triffinism was a widespread belief that the U.S. position in the international system was unsustainable. There were indeed wobbles, which in August 1971 led Richard Nixon to close the gold window (i.e., end the formal convertibility of the U.S. dollar into gold at $35 per ounce) and impose an emergency 10 percent tariff. There were wobbles again in the late 1970s, when loose fiscal policy and high levels of U.S. inflation created a new lack of confidence in the dollar, which Jimmy Carter discussed in terms of an American malaise. But in fact the dollar became more and more central to the operation of the international financial system, and there is little sign in today’s data that its dominance is being shaken.
Nevertheless, the Triffin analysis was revived and appeared to become ever more compelling as U.S. fiscal deficits, the debt level, and the share of the debt held abroad mounted. The new Triffinites followed their master in warning about an impending doom. The debate was supercharged by the Covid pandemic and, by extension, by the political dynamic it produced. As Covid crippled the economy and cost jobs, the first Trump administration worried about the political fallout and embarked on a stimulus program, with checks mailed out bearing the President’s signature. The 2020 CARES Act brought a $2.2 trillion economic stimulus package, with $1,200 payments to single adults and higher amounts to families. The appeal of the stimmies, which came close to winning Trump a second term in the 2020 election, was such that the new Biden administration believed it needed to follow with its own version, so that it could defeat Republicans in the 2022 midterms and the 2024 presidential election. Thus, the Biden administration followed with the $1.9 trillion American Rescue Plan of 2021, which also included direct payments of up to $1,400. In 2025, without any obvious economic emergency, the second Trump administration and its supporters started to debate a new round of personal payouts, perhaps to be funded from the increased revenues generated by higher import tariff rates. As the President put it, “We’re taking in so much money that we may very well make a dividend to the people of America.” Assessing the impact of the $3.4 trillion Big Beautiful Bill, however, involves offsetting a mild stimulatory effect (estimated at around 0.2 percent) against a substantial drag of approximately 1 percent imposed by the high tariffs.1 The result means continuing deficits of over 6 percent of GDP in an economy that is running hot.
The Covid experience precipitated an intensification of a democratic dynamic that was already observable in the later twentieth century. Spending more and taxing less were politically popular — or populist. Poorer or middle-income countries that succumbed to this dynamic would face debt crises. Rich countries, on the other hand, had greater credibility and could in consequence run deficits for longer. As interest rates fell, they looked as if they were enjoying a free lunch, and the case against adjustment (now labelled austerity) became ever more compelling. Countries that resisted, notably Germany, whose long-serving Finance Minister Wolfgang Schäuble was proud of the Schwarze Null, a balanced budget with no new debt, were ridiculed and lampooned. Elsewhere in the rich industrial world, which saw itself as exempt from the kind of debt crises that periodically hit poorer economies, building up debt looked like a way of managing political expectations (see Figure).
The behavior of modern democracies, and their inbuilt proclivity to use the opportunities offered by financial innovation to run deficits, is quite different from that of earlier states. The old story, best exemplified by Britain, was to build large surpluses (or pay off debts) in peacetime, so that the country would be prepared to run the deficits needed in case of a large-scale war. After the War of the Spanish Succession, after the Napoleonic Wars, even after the First World War, there was a massive and painful belt-tightening, an austerity justified by the ever-present possibility of a security threat. Debt levels were reduced. After the Second World War, however, first the Cold War atomic peace generated by the balance of terror, and then after 1990 the sense that the new world was going to be peaceful forever, obviated any need for precautionary austerity. It was only after the Russian attack on Ukraine in 2022, with the realization that more such fundamental disruptions were likely, that the old uncertainty about security and conflict reappeared. Deterrence had failed to guarantee a world permanently free of great power conflict.
One further development appeared to make the free lunch argument for permanent deficit democracies even more compelling. As industrialization spread, and as emerging economies grew with stunning rapidity, they saved more. The resulting savings glut was invoked, notably by Fed Chair Ben Bernanke, to explain the large global imbalances (the imbalances that allowed the United States to run twin budget and current account deficits). But this development too would prove to be a transitory moment of comfort. As societies aged, with the prospect of demographic decline in big emerging markets (most spectacularly in China), it was clear that the source of saving would be temporary, and that funds would become scarcer as older populations wanted to spend down their savings.
There was then a temporary window, produced by the triple coincidence of sophisticated financial markets, an international order that guaranteed peace, and a fast-growing, high-saving middle class in emerging markets, that led the industrial world to think that it could meet the growing demands of its democratic audiences.
Consequently, many rich governments pushed unfunded borrowing too far. An initial signal came from the experience of the short-lived Liz Truss government in the United Kingdom in September and October 2022. “Trussing” is an old English verb with differing connotations. It can mean putting in beams to support a building or a bridge, but also tying up a turkey before it is roasted. Since Prime Minister Truss’s brief government, it has also described the terrifying predicament of governments with large fiscal problems that make unrealistic promises. One sense of trussing is stabilizing; the other can be fiery.
Truss’s government fell apart in 2022 after £45 billion of unfunded tax cuts created financial panic. The Bank of England stepped in to buy government securities, but announced a time limit for its support operations: October 14. That was exactly the day the finance minister (the Chancellor of the Exchequer) had to resign, and Truss found a replacement who would work with rather than against the markets. But her government was finished, and within days she, too, was out.
The Truss experiment was widely seen at the time as revelatory, and ominous for countries with fiscal stress. Some European countries looked vulnerable to an analogous collapse, but it was read in particular as a warning sign for the United States. Indeed, some observers thought that the U.S. Treasury and the IMF were exaggerating their critique of U.K. tax policies in order to make a statement about American problems. It was in fact easy to see a repeat of the 1960s, when Britain’s struggles with the international monetary system preceded those of the United States, and markets treated Britain as part of the “outer perimeter defense” of the dollar.
The turn in interest rates marks the beginning of a new stage in thinking about the stability of debt. Another powerful warning came from Japan, the largest foreign holder of U.S. government debt, amounting to some $1.1 trillion (China used to be in that position, but it has reduced its holdings). Since 2019 the price of 30-year Japanese government bonds has fallen by half, making for big losses at the Bank of Japan, for insurers, for pension plans, and for private pension holders. As Japanese interest rates rise, domestic securities will become more attractive to Japanese investors, and the carry trade that saw them buying up U.S. securities will lose its appeal.
In this new environment, there were two roads the United States could take to ensure that the dollar remained at the center of the international system. Both are bets on a technological revolution, centered on AI, that is transforming economic activity in a way unparalleled since the early industrial revolution.
The first considers the effect of large investment in AI in raising productivity levels and hence overall economic performance. The basic debt-dynamics equation rests on the relation between interest rates and the rate of growth: pushing up growth makes debt more sustainable. A bet on growth is the last-gasp measure of many desperate governments: the United Kingdom tried it in the 1970s, with Chancellor of the Exchequer Tony Barber engineering a “dash for growth” and Cambridge economists propounding the theory that pushing up investment would create new producer goods, then more consumer goods, and bring down high rates of inflation. The Truss government had its own utopia of stunning growth, with a promise of a supply-side revolution based on “changes to the planning system, business regulations, childcare, immigration, agricultural productivity and digital infrastructure.” But AI gives a greater chance of such a transformative growth experience based on a supply-side revolution; dramatic productivity increases may no longer be just utopian dreams. The Congressional Budget Office suggests that a 0.5 percentage point increase in TFP growth over its baseline estimate would lead to a debt level in 2055 of 113 percent of GDP rather than the 156 percent currently projected, and real GNP per person 17 percent higher than in the current baseline. That would be a prosperous and happy America; the downside projection of a 0.5 percentage point lower TFP growth rate gives a debt level of 203 percent, and income 14 percent lower.2 The current investments are large. Figures suggest that big U.S. AI companies have increased their infrastructure investment estimates for 2025 from $60 billion to $350 billion.3
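The debt-dynamics relation invoked above has a standard one-period form, sketched here with d denoting the ratio of debt to GDP, r the nominal interest rate, g the nominal growth rate, and pb the primary balance as a share of GDP:

```latex
% One-period debt dynamics: d_t is the debt-to-GDP ratio, r the
% nominal interest rate, g nominal growth, pb_t the primary balance
% (a share of GDP; surpluses positive).
d_{t+1} = \frac{1+r}{1+g}\, d_t - pb_t
```

When g exceeds r, the ratio erodes on its own and deficits look costless; when r exceeds g, as after the post-2022 turn in rates, stabilizing the ratio requires primary surpluses. That asymmetry is what makes a bet on AI-driven growth so tempting.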
There is also the prospect of technology reducing the need for government spending. The case is especially dramatic for healthcare, where the effects of an aging population, with more chronic diseases and more need for long periods of care, along with the increasing costs of treatment, including pharmaceuticals, raise the prospect of continually higher government spending. The Centers for Medicare and Medicaid Services estimates that healthcare costs, which in 2022 amounted to 17 percent of GDP, will rise to 20 percent by 2031. The projected budget costs are even more dramatic: Medicare accounted for 4 percent of federal government expenditures in 1975 and 13 percent in 2025, and is projected to reach 19 percent in 2055. Is it possible to imagine a reversal of the trend?
There are many exciting possibilities that are already being realized. Cancer treatment is being transformed, including by the application of the mRNA techniques popularized in the Covid pandemic. GLP-1 drugs developed to treat diabetes and obesity have already produced dramatic reversals of chronic conditions, appear to reduce cardiovascular risks, and promise healthier aging. AI is increasing the speed of development of new drugs, with AI-driven drug discovery by ventures such as the OpenAI-backed Chai Discovery or Google DeepMind’s spin-out Isomorphic Labs. Given that cardiovascular conditions and cancers are common, and their treatment expensive, the possibilities of cost-reducing innovation appear considerable.
But there is obviously no guarantee that the productivity-enhancing AI investments will pay off, or how the healthcare revolution will work. In particular, it is not clear whether the United States will still be at the cutting edge of innovation. We do not know how far China has gone in developing an alternative AI system. The story at the beginning of 2025 of the success of the very inexpensive DeepSeek, rivaling the billion-dollar projects of OpenAI, suggests that it is easy to piggyback on other research, and that breakthroughs may occur in surprising places. The barriers to catching up are becoming ever lower with the development of an educated and hardworking global middle class, so that the institutional forces that once ensured technology remained the domain of a few advanced societies have eroded. There may be a dangerous race to control AI, and it is not clear that very large investments in the United States will ensure that country wins it.
Bets on resurrection often turn out to be an application of zero-sum game logic. Think of countries that have recently escaped from severe debt overhangs. Ireland rebounded from the apparently impossible debt levels of 2010 that prompted the IMF (and most Irish economists) to call for a haircut, or debt reduction; it could do so because the international economy was relatively resilient and because low corporate tax rates attracted big global corporations to base their activities there (and some of those activities even generated employment). With the Big Beautiful Bill, and with its criticism of global tax regimes such as that put forward by the OECD, the United States appears to be playing a version of this game, attracting business back from low-tax environments.
The second gambit to rescue the dollar is rather more dangerous. Dollar assets need to become appealing once more to the rest of the world, even as investors grow leery of countries with high levels of debt and heavy commitments to debt service. Too many bonds have been produced, with the ECB’s Isabel Schnabel speaking of a “bond glut.” With the oversupply of the old monetary instrument, it was natural to look for a new alternative driven by new financial technology.
An easy path to a new surge of dollar appeal is to create brand-new assets: in particular, by promoting stablecoins, digital blockchain-ledger currencies that are tied to existing fiat currencies, in practice mostly the U.S. dollar. The stablecoin is a dollar, but only sort of. To ensure that convertibility, the issuers of stablecoins need to hold a substantial reserve of U.S. dollar assets. If they held 100 percent reserves, they would indeed be perfectly stable; but they hold other assets as well. Tether, for instance, holds gold and bitcoin, both of which do well as doubts about the dollar increase. The stablecoin issuer thus appears to be making a successful two-way bet: if the dollar is strong, the stablecoin will be attractive and draw inflows from new purchasers; if the dollar falters, the alternative assets in its portfolio strengthen.
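A minimal numerical sketch of that two-way bet may make the mechanism concrete. The portfolio weights and price scenarios below are hypothetical, invented for exposition rather than drawn from Tether’s actual balance sheet; the point is simply how partial dollar reserves behave across scenarios:

```python
# Illustrative, hypothetical model of a partially reserved stablecoin
# issuer; the reserve weights and price moves are invented for exposition.

def reserve_coverage(coins_outstanding, reserves, prices):
    """Mark the reserve portfolio to market and divide by coins outstanding."""
    value = sum(units * prices[asset] for asset, units in reserves.items())
    return value / coins_outstanding

coins_outstanding = 100.0  # stablecoins redeemable one-for-one in dollars
# Reserve units are stated at an initial value of $1 each.
reserves = {"usd_assets": 85.0, "gold": 10.0, "bitcoin": 5.0}

scenarios = {
    "strong dollar": {"usd_assets": 1.00, "gold": 1.00, "bitcoin": 1.00},
    "weak dollar":   {"usd_assets": 1.00, "gold": 1.40, "bitcoin": 1.40},
    "run + slump":   {"usd_assets": 1.00, "gold": 1.10, "bitcoin": 0.50},
}

for name, prices in scenarios.items():
    ratio = reserve_coverage(coins_outstanding, reserves, prices)
    note = "fully backed" if ratio >= 1 else "cannot meet all redemptions"
    print(f"{name:>13}: coverage = {ratio:.2f} ({note})")
```

The third scenario is the historically familiar one: the two-way bet pays off only so long as holders do not demand their dollars at the very moment the non-dollar assets fall.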
This argument neglects some important features of the stablecoin universe. Stablecoins are attractive because they offer uncontrolled, as well as low-friction, entry into the U.S. dollar. When the United States imposes financial sanctions, against North Korea, Iran, or Russia, and potentially more extensively against China, those countries can still use the U.S. dollar. The stablecoin market currently amounts to $285 billion (with Tether’s USDT at $181 billion and Circle’s USDC at $76 billion), but is predicted to grow very quickly. Tether was not allowed in the United States, but the provisions of the GENIUS Act bring it in as a permissible U.S. asset. Current estimates suggest that China transacts $80 billion in Tether annually, Turkey $40 billion, and Russia and Ukraine “hundreds of millions monthly.” These are cases where the holders of Tether largely want to escape observation. But there will always be doubts, and they were greatly reinforced in June 2025 by the ingenious operation of the (probably Israeli) hacker group Predatory Sparrow, which had previously disrupted Iranian transportation and communications systems. Iran had some 11 million stablecoin users, with 90 percent of the activity handled by the Nobitex exchange, founded in 2017, which also had heavy interactions with the Russian Garantex exchange. On June 18, 2025, Predatory Sparrow extracted $90 million from wallets linked to the Islamic Revolutionary Guard Corps and “burned” the funds, so that they simply vanished. If this sort of operation is possible, it poses a threat to any holder of stablecoins whose interests are not aligned with the United States. In other words, it reproduces the very problem of U.S. control that the ingenuity of the stablecoin was supposed to escape.
A further problem lies in the multiplicity of stablecoin issuers, which does not necessarily make for greater security and stability. Historians are well aware of the parallels, and those past cases are often cited by stablecoin skeptics, notably in the 2025 BIS report. In pre-Civil War America, many banks issued dollar notes, which were notionally convertible into specie, but the notes of some banks were better secured than others. Weak notes traded at a discount, and market participants and consumers needed to equip themselves with bulky lists of banks and assessments of their relative credibility. The system, in short, had an inherent vulnerability to panics and runs, which could be only partially addressed through the institution of central clearinghouses that shared risk. Stablecoins are thus reproducing a famous problem from the turbulent years of American banking history.
An even more striking parallel comes from even further back in time: the very early experience of the United Kingdom in the aftermath of the financial revolution of the late seventeenth century, usually interpreted as the transformative step that put Britain on the path to financial and political greatness. A crucial innovation was a bank that took over government debt; its shareholders were powerful figures who could ensure support in the legislature for fiscal measures (above all, excises and tariffs) that guaranteed the reliable servicing of government debt. That institution, created in 1694 and justified in the parliamentary legislation as a mechanism for “the carrying on the War against France,” was a brilliant success. So brilliant that the Tories, the rival political grouping to the Whigs who clustered around the Bank of England, wanted their own equivalent. It was established in 1711 as the South Sea Company and given the privilege of trading (in slaves and goods) with South America in the 1713 Treaty of Utrecht. In 1720, as the two companies competed with more and more promises of the riches that would accrue to their shareholders, and of the benefits that would follow from the use of their banknotes, a bubble economy developed. South Sea stock in particular surged.
Circle and Tether have different clienteles and use different platforms. Their competition risks repeating the bubble economy of early modern Britain: they will promise more and more in pursuit of market share, weakening themselves as they do so. And the obvious question will be: is it not better to invest straightforwardly in U.S. Treasuries?
Are the twin risky bets on the dollar likely to turn out to the advantage of other countries? It is already clear that the trade and tariff war is imposing great costs on the U.S. economy, and that China is relatively robust in response (in part because it holds the powerful card of at least temporary quasi-monopolies on rare earths, including dysprosium, precisely the magnetic metal required in large-scale computing and in nuclear fusion technology). Europe has the opportunity to move quickly with a digital euro that might provide the infrastructure needed for a greatly improved payments technology, using blockchain to solve a question thousands of years old: how to make reliable and secure payments in complex supply chains without problems in the delivery of the product. Delivery versus Payment (DvP) allows the payment to be precisely linked to the actual arrival of the good, and can easily be split up into the stages of the supply chain, as the sketch below illustrates. For that commercial simplification to happen, the central bank would need to provide tokenized central bank reserves, tokenized commercial bank money, and tokenized claims on financial and real assets. Jurisdictions that provide that sort of monetary instrument in a transparent fashion may well look more attractive than a flailing superpower driven by fiscal worries to make a Hail Mary pass on the basis of an inherently flawed instrument.
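A minimal sketch of the atomic-settlement logic behind DvP may be useful here. It assumes tokenized money and a tokenized claim on goods recorded on a single ledger; the class and token names are illustrative, not a description of any actual digital-euro design:

```python
# Illustrative single-ledger delivery-versus-payment (DvP) settlement:
# the money leg and the goods leg execute together, or not at all.

class Ledger:
    def __init__(self):
        self.cash = {}    # holder -> balance of tokenized money
        self.goods = {}   # goods token id -> current owner

    def settle_dvp(self, buyer, seller, price, goods_token):
        """Swap money and a goods token atomically: both legs or neither."""
        if self.cash.get(buyer, 0.0) < price:
            raise ValueError("buyer lacks funds: neither leg executes")
        if self.goods.get(goods_token) != seller:
            raise ValueError("seller does not hold the token: neither leg executes")
        # Both preconditions hold, so both legs execute together.
        self.cash[buyer] -= price
        self.cash[seller] = self.cash.get(seller, 0.0) + price
        self.goods[goods_token] = buyer

ledger = Ledger()
ledger.cash = {"importer": 100.0}
ledger.goods = {"container-42": "exporter"}

# One stage of a supply chain: payment is released only against delivery.
ledger.settle_dvp("importer", "exporter", 60.0, "container-42")
print(ledger.cash)   # {'importer': 40.0, 'exporter': 60.0}
print(ledger.goods)  # {'container-42': 'importer'}
```

Splitting a transaction across the stages of a supply chain then becomes a sequence of such atomic swaps, each releasing payment only against a verifiable transfer of the corresponding goods token.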
About the Author
Harold James, the Claude and Lore Kelly Professor in European Studies at Princeton University, is Professor of History and International Affairs at the Woodrow Wilson School, and an associate at the Bendheim Center for Finance. His books include The German Slump (1986); A German Identity 1770-1990 (1989); International Monetary Cooperation Since Bretton Woods (1996); The End of Globalization (2001); Family Capitalism (2006); The Creation and Destruction of Value: The Globalization Cycle (2009); Making the European Monetary Union (2012); and The Euro and the Battle of Economic Ideas (with Markus K. Brunnermeier and Jean-Pierre Landau, 2016). He was also coauthor of a history of Deutsche Bank (1995), which won the Financial Times Global Business Book Award in 1996. His most recent books include Making a Modern Central Bank: The Bank of England 1979-2003; The War of Words: A Glossary of Globalization (2021); and Seven Crashes: The Economic Crises That Shaped Globalization. In 2004 he was awarded the Helmut Schmidt Prize for Economic History, and in 2005 the Ludwig Erhard Prize for writing about economics. He writes a monthly column for Project Syndicate.
Expert Views
Lessons from History for America Today
We asked respected experts to examine America's present challenges through an historical lens and identify lessons learned for navigating our nation onto a stronger, more prosperous path.