With President Biden’s proposed $1.9 trillion package looking likely to pass through budget reconciliation, it’s worth considering the macroeconomic consequences of such massive spending. This is on top of the $2.2 trillion CARES Act in the spring of 2020 and a further $900 billion of pandemic-related relief passed right at the end of 2020. The bottom line is that this new spending is unlikely to cause an inflationary event or meaningfully replace private sector investment.

Full Employment

To start, it’s helpful to understand the framework that the Federal Reserve, Congressional Budget Office, and most mainstream economists use to think about when an economy is “overheating” and when we should fear inflation.

Economists have a concept called “full employment” – the point where everyone who wants a job and is capable of working has found a job. Full employment does not mean an unemployment rate of 0% because it includes both “frictional” and “structural” unemployment: Those who are in between jobs in the natural churn of a dynamic labor market, and those whose skills do not match the needs of the economy right now (we’re not too worried, rightly or wrongly, about people whose skillset is exclusively making VCRs). When the unemployment rate is higher than full employment, economists say the labor market has “slack.” Slack is idle labor sitting around not fulfilling full potential. These are people who could be productive and want to work, but for whatever reason are not employed. During a recession or financial crisis, you can easily observe ample slack: Tons of people are out of work who only recently were very capable of working.

A recession leads to what Keynes called the “paradox of thrift”: an economic slump causes businesses and households to be risk-averse and cut back on spending and investment. This means less aggregate spending in the economy, which leads to fewer people being employed, and it becomes a vicious cycle. But when there’s slack in the labor market, government spending can put these people to work. This increased spending will, in theory, get more people back to work and get the labor market back to full employment. It will also give people more confidence to spend again. The Federal Reserve can also use monetary policy to increase investment. In the simplest explanation: The Fed makes it cheaper to borrow and easier for banks to lend out to households and businesses.

But what happens if we try to go past full employment? This is the concern from skeptics of the latest $1.9 trillion fiscal package. When everyone who wants a job and is capable of working has a job, businesses need to compete with each other for workers by raising wages. In a lot of ways, this is great! When we have a “tight labor market,” bargaining power shifts to workers and they can demand higher wages. When there’s a lot of slack, workers have less bargaining power because employers have plenty of willing workers to choose from and don’t have to compensate as well. But when we’re at full employment and spending continues to increase, the economy is then “overheating.” Businesses raise wages and then raise their prices. When both of these are going up, we have what is popularly called “inflation.”

Inflation

It’s important to take a slight detour here to describe what inflation is and what it is not. Usually in the press we see inflation defined by the year-over-year increases in the Consumer Price Index, which is a price index of a basket of stuff that your typical American buys. But a big jump in CPI on its own is not inflation. Sometimes it’s because of mathematical quirks, sometimes it’s because of a relative price change. For inflation to be true inflation, it needs to be what was described above: A persistent acceleration of both wages and prices.

You might remember that in the spring of 2020, oil prices on some financial markets went below zero. This is important because when CPI is reported for March or April in 2021, it will be expressing a change compared to the same month in 2020. If energy is insanely cheap or basically free in 2020, it’s going to come across as a big spike in the 2021 numbers. Not because March 2021 has oil spiraling upward to unreasonable numbers, but because it’s being compared to such a low number. This isn’t a purposely deceptive trick anyone’s pulling; it’s just the math of it. But if oil in March 2022 is $60/barrel (roughly where it is now), oil will contribute essentially nothing to the year-over-year CPI change in March 2022. In this way, it’s helpful to think of a big shock like the drop in oil in the spring of 2020 as a one-time event and not true inflation.
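To make that base-effect arithmetic concrete, here’s a minimal sketch in Python. The index levels are made up for illustration, not actual CPI data:

```python
# Illustrative base-effect arithmetic (index levels are made up, not actual CPI data).
# Year-over-year inflation is just (current / year_ago) - 1, so a deep trough in the
# comparison month produces a big "spike" even if prices today are back to normal.

energy_index = {
    "2020-03": 60.0,   # pandemic collapse in energy prices (illustrative)
    "2021-03": 100.0,  # back to a normal level
    "2022-03": 100.0,  # unchanged a year later
}

def yoy_change(index, month, year_ago):
    return index[month] / index[year_ago] - 1

print(f"March 2021 vs. March 2020: {yoy_change(energy_index, '2021-03', '2020-03'):+.0%}")  # +67%
print(f"March 2022 vs. March 2021: {yoy_change(energy_index, '2022-03', '2021-03'):+.0%}")  # +0%
```

The “+67%” in the first line is purely an artifact of the low 2020 base; by the following year the contribution washes out entirely.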

Many economists like to look at “Core CPI,” which takes out the more volatile components of food and energy. The motivation, I believe, is to get a better sense of price levels by stripping out the noisier aspects. But there can still be elements that, again, are one-time events and not true inflation. For example, the CARES Act in the spring of 2020 placed a moratorium on evictions. Rent is a solid chunk of the CPI basket – after all, it’s a big portion of the cost of living for pretty much everyone. But during this time, people who were not paying rent for whatever reason were showing up in the data as paying zero in rent. The cost of rent was not zero in any meaningful sense of how we measure prices, but it did put downward pressure on that part of the Consumer Price Index. When the moratorium is lifted, it might appear that rent for those people goes from zero to $1000/month, or whatever they had always been charged. This could cause the appearance of rent skyrocketing, bringing up CPI and giving an impression of inflation. But it’s not! It’s just a change in policy. So a one-time oil shock and a policy initiative like a rent moratorium are two examples of things that can cause illusory spikes in the metrics we tend to use to measure inflation. Economists and many financial market participants are aware of these things, but it’s still important to acknowledge that looking at CPI alone is not a good gauge of the level or existence of inflation.

Finding Full Employment

But back to our normal programming. It’s difficult to know exactly where full employment is. How do we know when everyone who wants a job and can work has one? It might sound obvious, but it’s not. There are endless sources of noise that can complicate “everyone who wants one” and “everyone who can work.” For example, a weak labor market can make people who feel their options are sub-optimal go back to school. Or someone out of work in their early 60s may be nudged into early retirement because they give up on trying to work a few more years. Because both of these groups aren’t actively “looking for work,” they are not considered unemployed. And how do we really measure who is in school for “lack of job” reasons and who’s there despite good job opportunities? Same with early retirement – did someone do this because they’re financially healthy and are happy to retire now, or because they have no good option? In most instances, these groups have a combination of both forces. And usually the last people to get hired in a tight labor market are those who are seen, for whatever reason, as less desirable to employ – think of people who have been out of the labor force for a long time or the previously incarcerated. In a tight labor market, an employer will bet on hiring from these groups that are seen as riskier hires. But otherwise, these groups can become discouraged and stop looking for work, further complicating the goal of knowing when “everyone who wants a job can get one.”

There are a lot of signs that previous estimates of what unemployment rate constitutes “full employment” were very wrong. In December 2019 (pre-pandemic world), the Fed had thought the unemployment rate associated with full employment was 4.1%. And this was in contrast to their projection exactly two years earlier of 4.6%. The historical idea of what unemployment rate represented full employment had to be continually revised downward because we kept getting the unemployment rate lower without any spike in prices. In theory, any decrease in the unemployment rate past that should have led to overheating, and an inflationary event where businesses have no choice but to raise wages and prices in response to there being no one new to hire. But it didn’t happen. The US got its unemployment rate down to 3.5% in February 2020 with CPI and PCE (a different measure of the price level) never going above 2.5%. Furthermore, wages started to tick up (this is good!), suggesting a tightening labor market that wasn’t being accompanied by consumer price increases.

We can look at a few indicators that, in hindsight, show us that the labor market still had room to tighten at this point in time, that there was still slack. First, consider the “Employment to Population” ratio, which is a measure of what percentage of people above the age of 16 are employed.

You can see that February 2020, at 61.1%, was lower than the pre-financial crisis peak in January 2007 of 63.3%, and in fact lower than anytime between February 1987 and December 2008. Remember that underlying that trend is a steady increase since 1950 in the female participation rate – the percentage of women above 16 who have a wage-paying job or are actively looking for one. The female LFPR plateaued in 2000 more or less, and we can probably interpret that as the changing cultural forces of women working reaching somewhat of a ceiling (for now, maybe).

But the increasing participation of women means that those aggregate employment-to-population data include a huge decrease in the employment-to-population ratio for men. Take a look at these charts:

Is this because lots of men have decided to stay home and take care of the kids? Highly doubtful there’s strong data to support that explanation for such a huge drop in the last half century. Are more men deciding to attend higher education? Again, no. Any statistic will be an incomplete snapshot of whatever it is you’re trying to measure, but I think the main takeaway here is that a lot of men were not employed who wanted to be. (This fits well into the narratives suggesting lower marriage rates are in part due to a shortage of “marriageable men,” and also “deaths of despair.”)

In fact, I’d like to put forth that we haven’t been at full employment or full capacity for quite some time. Remember that in the framework of full employment we should see an acceleration of prices. And, the Federal Reserve’s target for price increases is 2% annual changes. This means that with prices increasing at sustained levels faster than that, they’d start to freak out and raise interest rates to stop the economy from overheating. Up until that point, they’re (in theory) happy to let the economy keep getting hotter. So take a look at this chart of the yearly changes of the relevant price index the Fed likes to use.

There were little momentary blips above 2% in the last few decades but nothing we’d consider an inflationary event. In fact, the last time Core PCE was at 2.5% was the first quarter of 1991. There are a number of theories on why the economy hasn’t run hot enough to get there for so long – I take a stab at it later on in this post – but the main takeaway is that there’s no good evidence the economy has been operating at full capacity at any point in a really long time.

This is all important because when economists try to consider what level of increased government spending will lead to inflation, they look at the “output gap” – the difference between what our economy is producing and what it is capable of producing. The Congressional Budget Office, the Fed, and others have an estimate for what “full capacity” GDP is, now typically benchmarked off where we were in February 2020. The problem is that if we weren’t at full employment or full capacity in February 2020, or perhaps for the last couple of decades, then getting back to February 2020 levels of employment and GDP will still leave us short of what we want. February 2020 and the summer of 2007 may have been peaks in their respective cycles, but that doesn’t mean we were at full capacity then. So trying to fill the output gap relative to where our economy would have been, on trend from February 2020 without the pandemic, is still going to be an undershoot of full capacity.
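Here’s a toy sketch of that argument. All the numbers are hypothetical stand-ins, not actual CBO estimates:

```python
# Toy illustration of the output-gap argument (all figures hypothetical, in trillions).
# If "potential" GDP is benchmarked to February 2020, a moment that itself had slack,
# then closing the measured gap still leaves the economy short of true full capacity.

actual_gdp = 19.0               # current output (hypothetical)
potential_from_feb2020 = 21.5   # potential extrapolated from the Feb 2020 trend (hypothetical)
true_full_capacity = 22.5       # what output could be if Feb 2020 also had slack (hypothetical)

measured_gap = (actual_gdp - potential_from_feb2020) / potential_from_feb2020
true_gap = (actual_gdp - true_full_capacity) / true_full_capacity

print(f"Measured output gap: {measured_gap:.1%}")   # about -11.6%
print(f"'True' output gap:   {true_gap:.1%}")       # about -15.6%, more room than the estimate implies
```

The point isn’t the specific numbers; it’s that if the benchmark itself understates capacity, the “gap” that spending is supposed to fill is understated too.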

How to Know It’s Too Much

The financial markets now are expecting inflation in 5 years to be just over 2%.
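Market-based expectations like these typically come from Treasury “breakeven” rates: the gap between the yield on a nominal Treasury and the yield on an inflation-protected Treasury (TIPS) of the same maturity. A minimal sketch, assuming that’s the measure behind the chart and using illustrative yields rather than actual quotes:

```python
# 5-year breakeven inflation: nominal Treasury yield minus TIPS (real) yield at the same maturity.
# The yields below are illustrative placeholders, not actual market quotes.

nominal_5y_yield = 0.006   # 0.6% nominal 5-year Treasury yield (hypothetical)
tips_5y_yield = -0.016     # -1.6% 5-year TIPS yield (hypothetical)

breakeven_5y = nominal_5y_yield - tips_5y_yield
print(f"Implied average inflation over the next 5 years: {breakeven_5y:.1%}")  # ~2.2%
```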

This might look like a sign to sound the alarm, but it’s just above 2%. And this is likely pricing in the $1.9 trillion package passing, which means that almost $5 trillion in extra government spending from the pandemic – about 25% of the annual output of the United States – has made a pretty minor blip in the inflation expectations rate.
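The back-of-the-envelope arithmetic behind that figure, taking annual US output as roughly $21 trillion (an approximation):

```python
# Rough arithmetic behind "about 25% of annual output" (the GDP figure is approximate).
cares_act = 2.2         # trillions, spring 2020
december_relief = 0.9   # trillions, end of 2020
proposed_package = 1.9  # trillions, the Biden proposal

total = cares_act + december_relief + proposed_package   # 5.0 trillion
us_annual_gdp = 21.0                                      # approximate nominal GDP, trillions

print(f"Total pandemic spending: ${total:.1f} trillion (~{total / us_annual_gdp:.0%} of annual output)")
# Total pandemic spending: $5.0 trillion (~24% of annual output)
```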

Another cause for concern from spending trillions of dollars of borrowed money is the potential for “crowding out.” The idea here is that government spending financed by borrowing uses resources that would otherwise go to private investment or consumption. If the government is absorbing money saved by households and businesses, funds available for investment become scarcer, so borrowing becomes more expensive for everyone else – interest rates go up. But in the case of a slack labor market, where the “paradox of thrift” rules, households and businesses have idled resources to save for a rainy day. Government spending puts those resources back to work.

The political discourse has always been mindful of this crowding out, so we’ve always felt like we’ve gotten to the brink of it without really getting there. The Obama stimulus in 2009 was less than $800 billion, and it was enough to give rise to the Tea Party. The dire circumstances of the pandemic have given us an experiment to see just how far we can go. Not since World War II has there been such urgency to get money out the door in such a big way. What’s then amazing is how small $800 billion looks compared to what the Federal Government has done already, even without the newest potential $1.9 trillion package. And yet…the world hasn’t ended! Interest rates budged up a little when you look at this chart showing the interest rate on 10-year bonds in the last two years:

Whoa there! That’s a turn up for sure. But then zoom out for the longer run picture and that blip up is…not much.

I think the takeaway here is that there has always been a huge amount of excess saving in the world, and we never knew how much room there was for the government to borrow because we never wanted to test where the edge was. Turns out there was a lot more money willing to go to the US government than we expected. But admittedly, looking purely at nominal interest rates is an oversimplified way of gauging the level of crowding out or pressure on the public fiscal situation. A discussion for another day, too, is that government spending is not an apples-to-apples comparison with private spending: The discretion of the different actors can change the quality/value of what that money is being spent on. But to me it seems very likely that wherever the “edge” is for when government borrowing will lead to crowding out or inflation, we are not at that point right now.

This Time is Different: Capacity Constraints And Household Balance Sheets

What’s indeed very different about our economy right now compared to 2008, 2001, or the vast majority of economic downturns is that our capacity is being restricted by the pandemic. Businesses and households are fully capable of doing things they aren’t doing, not because anything in the economic machinery is broken, but because they or the government have deemed the public health risk is too great for them to operate as they would otherwise. And this is the most legitimate concern for those worried about inflation: Our capacity to produce is limited by the pandemic, so previous levels of income chasing less production will lead to a spiral upward in prices. But this is overstated for a number of reasons.

There’s been a lot of suffering – economic, mental, and of course in terms of public health – in the last year. But it’s important to consider the national aggregates for things like savings and income. Because the suffering has not fallen on everyone equally, the national picture can look quite surprising, namely because those in the top quintile have more or less kept their jobs while saving a lot more money. Take a look at this chart from tracktherecovery.org about the employment picture for top earners versus the lowest earners.

The number of jobs has actually gone up for the highest earners. And then take a look at the national savings data:

Wow, that’s wild! This national number shows that a) people who maintained pre-pandemic income were left to save a lot of money because they weren’t able to go on vacations, eat at restaurants, or spend on a number of other expensive things they’d otherwise enjoy; b) the fiscal packages so far have in part given money to a decent number of people who don’t have an immediate need to spend; c) it’s a once-in-a-century pandemic with a huge recession, so people are going to save in case they lose their job sometime soon.

In fact, nationally, income is looking pretty good!

Pent-Up Demand

So back to the matter at hand: With 2 trillion dollars in personal savings, will there be a mad rush of spending once the economy opens back up later this year, once government restrictions ease up, vaccinations get to a critical point, and (hopefully) most of the pandemic is in the rear view mirror? Well, let’s consider where this “pent-up demand” will go. To the extent that people will spend their saved money, where could it go? The industries most impacted by the pandemic aren’t going to be back to where they were in February 2020. Restaurants won’t all reopen, lots of places have gone out of business, supply chains have been decimated. So if people all try to spend like they used to on industries operating at 50% of their previous capacity, those industries can’t keep up with the demand and they’ll have to raise prices. This is what the inflation hawks are worried about.

While people may splurge on going out to eat a bit more, going on a vacation, or going to a concert, they’re not going to make up for a year’s worth of lost time entirely. They’re not going to the bar to get 10 drinks instead of 2 or getting 4 haircuts to make up for the only one they got in 2020. They’re not going to go on every vacation they would have otherwise taken between March 2020 and August 2021. Or see a year’s worth of movies in one month. And if they did, think about what this would mean: movie tickets that are usually $15 would now be $50, and the $7 beer at a restaurant would now be $15. But most of the places with remaining capacity issues sell goods and services with what economists call elastic demand. You’re not going to spend $50 on a movie ticket. Personally, I’ll probably be willing to spend more on some travel just to get the hell out of New York. But for an inflation event to occur, you’d need to see these industries have a huge run-up in prices. More likely, people will shift to spending more on a new streaming service rather than pay $50 for a movie ticket. They’ll have a backyard bbq with some grocery store beers instead of having a $15 beer at a bar. Or they’ll get a Peloton instead of a more expensive gym membership. So the bottom line is that we’d see people shifting their consumption to industries that do not have the supply constraints of the damaged industries. Contrast this with the 1970s: when oil prices surged, people weren’t going to stop heating their homes or driving to work. They probably cut back on gas consumption a little, but their demand was “inelastic,” and thus their consumption was not very sensitive to price changes.

Another element is that, just like every other economic downturn, people will be skittish about spending all their savings. There’s real psychological scarring from a recession and I would think that, more than previous recessions, this one was really in your face in terms of lifestyle and news, no matter what your income situation has been.

What’s the deal with all these inflation worries?!

If you look back at the charts showing the year-over-year changes in price levels, you might wonder why policymakers have been so worried about inflation. It’s true that the typical framework suggests that once inflation happens, stopping that process can be pretty painful. So getting to the edge and then realizing it’s already happening means… it’s too late. But I think that too many of the powers that be have been scarred by the stagflation of the 1970s, where low growth and high inflation wreaked havoc on the economy. (As far as I can tell, not nearly as painful as a financial crisis recession, or the aggregate misery from having an economy with millions of underemployed people.) The Federal Reserve hiked rates tremendously, causing a really bad recession – though seeming to choke off inflation – in the early 1980s and making central bankers vow never to relive the episode. And many of the people who lived through that were the PhD advisers to the people who are now our philosopher-kings of economics. But for the current generation of economists, there’s just no lived experience that justifies being scarred by a painful inflationary episode. A 40-year-old person would have been a newborn the last time core PCE year-over-year changes were at 3%. In a sense, this older generation of economists is too busy fighting the last war, and the younger one is much more concerned with inequality, student debt, and sluggish growth. To this extent, one should be optimistic about the future in terms of recalibrating inflation fears.

There’s further reason for optimism: The Federal Reserve has made it clear it will tolerate inflation above 2%, so don’t expect interest rate hikes any time soon. Jerome Powell, the current Fed Chair, has testified that he acknowledges the Fed has been wrong about its estimate for full employment. Lael Brainard, who is on the Federal Reserve Board of Governors, has also stated that the Fed is baking a revised estimate of full employment into its projections. On the monetary policy front, we can be hopeful that we’ll be closer to running the economy as hot as it can get without inflation fears.

On the fiscal front, this $1.9 trillion may not be perfect in its composition – nor an exact filler for any output gap – but we shouldn’t fear the package leading to harmful inflation or meaningful crowding out of private investment.


I wrote a review of Matt Yglesias’s recent book One Billion Americans for the LA Review of Books. Check it out.

Excerpt:

One Billion Americans has a lot of excellent policy ideas: increasing the American population through Yglesias’s prescriptive policies would be beneficial and morally just. And the advantages of population growth themselves should be given more attention. But for those already in agreement with Yglesias, it’s difficult to find much fundamentally new in his arguments or framing. For those in opposition, the book is unlikely to convert.

Yglesias waffles between arguments he believes are pragmatic and those that are encouraging “thinking bigger.” Both of these have value, but he switches between them at random times, operating seemingly at whatever level is the path of least resistance. Are we meant to take his idea of a billion Americans seriously but not literally? His ideas can support a much higher population even if it’s not a billion, but he struggles to establish when it’s time to think pragmatically and when to just think bigger.

The Chicago White Sox, my beloved baseball team for my entire life, made the playoffs this year for the first time since 2008. I enjoyed tuning in for the quick three-game series they lost to the Oakland Athletics. The occasion didn’t carry nearly as much emotional weight for me as the big games for my sports teams in previous years. I cried when they lost in the 2000 playoffs, felt immense rage when the Bears lost to the Colts in the Super Bowl, and believed I was personally harmed when Derrick Rose tore his ACL. I couldn’t name every player on the Sox this year and I hadn’t watched any regular season games. I’d like to think I’m a mature adult now with a better perspective on what’s really important in life, but I also recognize that I follow politics a lot like I used to follow sports.

Ezra Klein compared the intense tribalism from increasing political polarization to sports loyalty in his book Why We’re Polarized. We have come to attach so much energy to our side winning in politics that we lose sight of the actual substance underneath. Political parties have come to map so closely onto our identities, he argues, that our side losing feels like a threat to who we are. Donald Trump shares little in common in terms of policy with the Republican Party of 2012, yet the overlap in voters is pretty strong. Our instinctive reaction to any political development is more about which side is doing the thing, rather than whether it’s good or bad for society.

Tyler Cowen recently had a post about declining sports viewership and included one possible explanation as crowding out from “political fanaticism.” I think it’s correct that an outrageous President and the accompanying news cycles have sucked the air out of the room where people used to pay attention to sports. And I realize there’s a huge overlap between how I used to follow sports and how I follow politics now.

How I used to follow sports:

  • Spend a little time every morning checking the results of major events
  • Read about the history of different sports/players to better develop an encyclopedic knowledge of each sport and better understand the big picture context of current events
  • Follow my team closer than other teams and feel an emotional attachment to their success/failures
  • Be defined almost as much by what I loathe (the Cubs) as what I support (White Sox)
  • Get a sense of companionship from talking about sports with similarly informed people

How I now follow politics:

  • Spend a little time every morning checking 538 and Twitter to see the status of the horse race in different elections
  • Read about the history of different elections/politicians to better develop an encyclopedic knowledge of the political system and better understand the big picture context of current events
  • Follow my country’s politics closer than international ones and feel an emotional attachment to my preferred party’s success/failures.
  • Be defined almost as much by what I loathe (Trump) as what I support (sometimes I’m not even sure?)
  • Get a sense of companionship from talking about politics with similarly informed people

I used to justify the attention I gave to politics as “this makes me a better-informed citizen.” Sports viewership is purely a hobby and learning more about the current/past results doesn’t make me a better person. But I always knew that. It was just entertainment. Indeed, I do think that pre-Trump I used to follow politics in large part because it was the necessary vehicle through which different policy preferences (which I spend a lot of time reading about) were implemented. On a good day, following politics closely might increase my personal civic engagement, but there are a number of ways where that justification is just BS.

  • Following the horse race of the Presidential campaign is where I spend most of my Online time. There is nothing of substance here. It’s either a level of reassurance that Biden will win, or disaster porn of some kind.
  • The way to actually make a difference in politics is through local government, not national elections. I can tell you my two New York Senators and the Mayor but not my Representative in the House, my state senator, or any other local officials. If my goal was really civic engagement, I wouldn’t spend so much time worrying about Trump.
  • I closely followed hourly developments in things like the Mueller probe that 1) did not affect my life 2) I could not impact and 3) only served to satisfy my urge to see Trump get punished.

Since I became a grad student and cord-cutter, it’s been difficult to keep up with any sports. Going to a bar for 4 hours to watch a Bears game on a day off was exhausting and expensive. Watching enough Sox or Bulls games to feel connected to either team was tough to find on tv and took too much time. But while my sports viewership has gone down markedly, it has been completely substituted by the spectator sport of Politics. It sucks, but it is damn addictive.

I recently finished Money: The True Story of a Made-Up Thing by Jacob Goldstein, co-host of NPR’s Planet Money podcast. As with Planet Money, the book has something of value both for people well-versed in economic history and know-how and for those completely new to economics. The book also shares Planet Money’s uncanny ability to use (often quirky) stories to make a point. I found Money to be a short, digestible, and – given its length – surprisingly comprehensive look at the history behind society’s evolving definition of, use of, and attempts to rein in the power of money.

The biggest takeaway for a reader of any background is found in the book’s conclusion: “…the reminder that there’s nothing natural or inevitable about the way money works now. We know money will be very different in the future; we just don’t know what kind of different it will be.” We all have so intensely internalized what “money” looks like, what it can and cannot be, what government actions will cause inflation, that even people swimming daily in finance and economics forget how malleable the nature of “money” has been.

This valuable lesson comes from the book’s rich history of what societies have used for money and how they’ve dealt with certain difficulties along the way. The barter economy Adam Smith described in Wealth of Nations, the book points out, never really existed. People always exchanged their stuff for what they wanted from other people. There wasn’t any currency as we think of it at first, but there was reciprocal gift-giving. “A power move, like insisting on paying the check at a restaurant,” as Goldstein puts it.

I found it notable that, despite not having the bustling marketplaces we see today everywhere in the world, it was still commerce that brought the earliest communities together: The Greek agora was meant to act as a meeting place for civic discussion, with a sideshow for a place to allow people to exchange their wares. “In the long run, shopping won out over public discourse.” I’ve always thought people underestimate the power of commerce to bring communities together, and this seems to be an illustration of that power. (Sadly perhaps, people have shown themselves to be much more interested in going to the farmer’s market than town hall meetings.)

For as long as groups have declared power over others, they have collected taxes. But without a standard representation of value like money, cloth and grain were what the government in China collected around the 11th century. So everyone had the annoyingly inefficient responsibility of weaving or growing to some extent just to pay taxes. The currency that arrived with the invading Mongols allowed Chinese people to specialize in their crafts and pay their taxes by focusing their time on what they did best. This allowed China to flourish centuries before the future economic powerhouses of Europe.

But the man who drove out the Mongols and eventually founded what became the Ming Dynasty wanted to Make China Great Again, and that involved getting rid of the “money” system that had allowed China to be the world’s most advanced nation by the late 14th century. Soon China went back to the cloth-and-grain system and regressed tremendously. The removal of money from China is not exactly the entire story here, but the book shows an interesting experiment about the economic impacts of introducing and then removing money.

The great lesson from this time is that China had, for centuries, thrived under a system where money was not tied to anything like a precious commodity. Money was worth something because everyone else just believed it was worth something. And this is essentially what defines our monetary system today. The dollar cannot be eaten or boiled down into jewelry. Short of the government accepting it as payment for taxes, it would have no value if everyone decided one day that it no longer had any.

But historically the idea that money was tied to nothing but the government’s enforcement (sometimes by death) of its value typically didn’t sit well with people. Linking currencies to gold or silver was thought to establish credibility, and somewhat limit the ability of warmongering sovereigns to inflate their way out of any problem. (Of course even under such regimes, people would fudge the weight of their coins, and sovereigns often spent their way to fiscal ruin.) The United States dollar was no exception. The convertibility rate of the dollar to gold was essentially fixed, giving predictability to the international financial system and confidence to the users of dollars.

Fast forward to the late 1920s. A banking ‘panic’ – where people all at once worried their banks would fail and their deposits would be wiped out – started the way such panics usually did back then. People decided they’d convert all of their dollar deposits into gold. This was before the FDIC insured bank deposits, so people thought the safest place for their savings was out of the bank and into gold. But banks only have so much gold, and as the gold supply dwindled, it created a vicious feedback loop where people’s fear fueled a run on the banks’ gold reserves.

The Federal Reserve – created only recently, in an effort to smooth over panics and financial crises like this – decided to raise interest rates. This was what the playbook said for countries on the gold standard. Raising interest rates was a way to incentivize people to keep their money in banks and prevent the impending bank runs that would cause a financial crisis: higher rates meant depositors would get higher returns. But raising interest rates also makes investment more difficult. And when the money supply contracts, economic activity shrinks.

And this, dear readers, is how the Federal Reserve turned an otherwise garden-variety recession into the Great Depression. FDR, against all the doomsday predictions of his advisors, took the dollar off of gold. It allowed for a speedier recovery and showed, yet again, that our assumptions about what money needs to be can make us quite dogmatic about what will happen when the nature of it is changed. FDR, according to Goldstein, stopped the bank runs with his comforting fireside chats. Indeed, when it came to bank runs, it really was true that the only thing we had to fear was fear itself. Once everyone thought banks weren’t at risk, they stopped pulling out their money and it became a self-fulfilling prophecy. Yet again, the shared trust in the credibility of the dollar and banks was what gave the system its value.

The history of central banks has shown societies’ delicate experiments with how best to prevent financial crises, and the United States is no exception. Alexander Hamilton pushed for a “national bank” at the country’s outset. Populist Andrew Jackson thought that banks and the coastal elites who ran them had too much control over the inland farmers and countrymen he was claiming to represent. So he got rid of the national bank.

What dominated for decades was a period of ‘free banking,’ where any bank could print its own currency. At one point, there were 8,370 different kinds in circulation. A helpful reference book would tell merchants the value of each bank’s bill and, in an effort to prevent counterfeiting, the appearance of each bank’s currency. It sounds like total chaos, and in some ways it was. The system led to financial panics every decade or so, and the US went back to a national bank called the Federal Reserve in the early 20th century for good reason.

But the frictions in the system weren’t as catastrophic as you might think. As Goldstein says, “Travelers typically lost around 1 or 2 percent when they exchanged paper money, in the same ballpark as the fee I pay today when I can’t get to my bank and have to use another bank’s ATM.” Also, perhaps surprisingly, there weren’t all that many totally fraudulent banks. Today, imagining 8,000 different corporations printing their own Monopoly money sounds insane. But it worked better than you’d think!

Today, the Federal Reserve is made up of 12 regional banks and a Board of Governors, and it steers short-run interest rates through open market operations – buying and selling Treasury securities. That might sound like a lot of confusing word soup to someone not totally in the mix of finance and economics. But just know that what the Federal Reserve does today is a little different than what it did pre-Financial Crisis, more transparent than what it did before the 1980s, and much different than what it did when the dollar was still chained to gold before the 1930s.

There is an important distinction between “money” and “currency.” Currency is the coins and bills we can hold in our hands. But money can be numbers on a screen and what banks do. Put another way, there is a concrete, tangible amount of currency, but the amount of money in the system is always changing. One of my dollars deposited in a bank can be lent out to someone looking to start a business. That businessperson takes the dollar and pays a construction worker to build an office. I still have a dollar in my account, the businessperson has an office going up, and the construction worker has a dollar to do something with. This “money multiplier” is what causes economic activity to thrive, and its absence is in part what stopped China from reaching its full potential during the Ming Dynasty. But this distinction, or at least the ability to have the distinction, is pretty counter-intuitive and seemed primed for disaster for most of history.
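A minimal sketch of that multiplier process, assuming banks hold back a fixed fraction of each deposit as reserves (the 10% reserve ratio here is purely illustrative):

```python
# Toy money-multiplier simulation (the reserve ratio is illustrative).
# Each deposited dollar is partly held as reserves and partly lent back out,
# so the same initial currency supports a much larger total stock of "money."

initial_deposit = 100.0   # dollars of currency deposited at a bank
reserve_ratio = 0.10      # fraction of each deposit the bank keeps on hand (hypothetical)

total_money = 0.0
deposit = initial_deposit
for _ in range(200):                  # iterate until new deposits are negligible
    total_money += deposit            # this deposit is money someone can spend
    deposit *= (1 - reserve_ratio)    # the rest is lent out and re-deposited elsewhere

print(f"Total money supported: ~${total_money:,.0f}")                       # ~$1,000
print(f"Simple multiplier: {1 / reserve_ratio:.0f}x the original deposit")  # 10x
```

The geometric series converges to the initial deposit divided by the reserve ratio, which is why a small amount of currency can support a much larger money supply.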

Even our definition of a “bank” should be flexible. Today, as with most of history, a bank does two things: takes in and holds people’s money; and lends out money. Why do banks need to do both roles? Money shows a scenario where some institutions are essentially ‘money warehouses’ that store your money, while others are the ones that take risks and lend to borrowers using the money of investors. The mismatch between people wanting to safely deposit their money and the risks banks take is essentially where many frictions in the financial industry lie.

The arrival of cryptocurrencies has shown the potential to disrupt the idea that only the federal government can control currency. If our dollar bills have value only because everyone else agrees they do, why would numbers on a ledger be any different? So far, the issue is that the fluctuating value of Bitcoin makes it hard to use as a ‘store of value.’ I’ll admit that Bitcoin has been around much longer than I predicted, but it still has too many of the problems of historic attempts at money. Blockchain technology and cryptocurrencies may have some value for society in the future, even if it currently looks unlikely that they will replace the fiat money system we have now. If there’s one lesson from monetary history and Money, it’s that you should never count out changes that at first glance look pretty absurd.

Money has been printed by governments and it’s been printed by private banks. It’s been backed by precious commodities like gold and silver and it’s been backed by nothing. It’s been represented as coins and bills and it’s been represented merely as numbers on a screen. Its confusing nature has caused financial panics even when there was nothing wrong. Our relationship with, control over, and treatment of the concept of money will continue to change. A book like Money shows us how we should learn to accept and accommodate that inevitable change.

New podcast episode released this morning. “Golden Gates: Housing Affordability in California” features NYT’s Conor Dougherty as he talks about his recent book Golden Gates. Check it out on iTunes or wherever you get your podcasts.

I wrote a review for Liberal Currents of Ezra Klein’s latest book Why We’re Polarized.

Klein sees the election as a culmination of our social psychology mixing with a media landscape designed to outrage, in a political system that incentivizes Republicans to become more extreme. We are hard-wired to protect our identities from external threats, and contemporary political parties have become strong proxies for the groups to which we belong. The media and politicians tap into our psychology that makes us react more strongly to threats and antagonism than to positivity. And the American political system was designed centuries ago to represent geography more than popularity in a way that makes Republican electoral success tied more to extreme stances than winning over swing voters. All of this, according to Klein, leads to “a legitimacy crisis that could threaten the very foundation of our political system.” The book’s claim that political parties now stand in for identities, in a way that leads to more polarization than was common in the 20th century, is convincing. However, Klein leaves important social factors unanalyzed, and there is reason to believe he is presenting current trends as more inevitable than they in fact are.

Check it out.

Leftists looking for a greater chance of making their desired change should rally around Elizabeth Warren as their preferred candidate. If the idea is to support the candidate with the ideal ideology, then Sanders could be the guy. But if the idea is to get policy closest to what leftists think is ideal, Warren is far preferable.

Sanders may present himself as passing a higher purity test – Warren has said she’s a capitalist, after all. Yet Warren has proven that she is more effective at working within the system.

Warren has been a Senator since 2013, but her biggest legislative impact may have been through the Consumer Financial Protection Bureau. The CFPB was created after the financial crisis as an effort to stop predatory banking practices and increase transparency in the financial industry. Her campaign is known for having a policy proposal for everything, whereas Sanders relies more on rallying cries with unclear policy specifics.

Bernie’s legislative accomplishments have been sparse. He has been a member of Congress since 1991, but two of the seven successful bills he has sponsored have been to rename post offices, and one was to commemorate a Vermont ‘bicentennial day.’ With his proposed legislation being so far out there, how different is he, effectively, from a replacement-level Democratic Senator? He has admirably stuck to his principles when most Democrats joined Republicans for sub-optimal things like foreign intervention, financial deregulation, or curbing civil liberties. But whereas Biden has some shady history here, Warren is no different than Bernie in this regard.

Noah Smith debated Meagan Day for Bloggingheads in what was billed as a match pitting “neoliberalism versus democratic socialism.” Something that struck me over the more than 80 minutes of conversation was the different angles from which each person approached the issues. Noah was well-versed in specific statistics, compared the impacts of various incremental policy changes, and gave a sense of when certain approaches worked better than others. Meagan took a much more birds-eye view of the system, pointing out big-picture shortcomings of capitalism and the injustices of many policies.

I found Meagan’s approach to be valuable in a philosophical sense – pondering big questions about systemic realities we take as given that need not necessarily be. Important questions, and true meaningful systemic change comes from the seeds of ideas that originally seem radical. But Noah’s style of analysis fit much more into the “technocratic left” approach to policy. He looks at policy impacts as fitting into the current system rather than the very most ideal system. Meagan learns more about the system and the surrounding world to figure out what the best system would be. Noah learns about the system and tries to answer questions using more incremental change.

I see Sanders operating at a level much more in tune with Meagan – creating a broad philosophical approach to politics that emphasizes theory more than the political roadmap to get there. Warren looks at the problems on Wall Street and says “I’m going to create the Consumer Financial Protection Bureau.” Sanders proposes a lot of bills that, while maybe really great in principle, don’t have much of a chance of even getting out of committee.

The reality is that changes are almost always made on an incremental level within the existing system. The American system is designed to decentralize power, causing change to be slow and gradual rather than revolutionary. It’s not impossible for a Sanders Presidency to fundamentally remake the entire system, but I also consider it very unlikely.

Consider an extreme libertarian or communist who is elected to the village board of their town. The libertarian could have an ideological commitment to privatizing the local schools or moving the country off fiat money. The communist would like to abolish private property and unionize the entirety of the working class. But in these positions, the local residents just want to make sure the potholes are filled and the garbage is taken out. If the libertarian dies on the hill of “all local spending is unjustified except for maybe a small police force,” they’re going to be left out of conversations where there is actual discretion over public spending.

So the ideological difference between Sanders and Warren is moot to me when you consider that they would be President, not benevolent autocrat. They’ll be met with a resistant Congress and moderates of their own party who will not allow their most progressive initiatives to pass. The areas where the President has the most relatively unchecked power – things like foreign policy and regulation – are policy verticals where Sanders and Warren aren’t very different from each other.

I should also note that I consider both Warren and Sanders to be incredibly principled and consistent in their views. They have proven to be champions of their causes over a long period of time, and are not corrupt or likely to compromise their values. They are not faux-progressives, and the discussion about whether Warren having previously been a Republican compromises her consistency is really stupid.

Trump has shown that people will rally around their party and its leader to an incredible extent when it comes to policy preferences. Republican voters’ views during the Trump administration have been revealed to be incredibly malleable. Suddenly, Russia is not the bad guy and tariffs are great. It could be said that a Sanders platform, by merely being the stated policy platform of the Democratic Party’s highest elected official, would shift the views of half the country. There’s some truth to this. Maybe the Overton Window would dramatically change and instead of the parties nitpicking over the nuances of the Affordable Care Act, the discussion suddenly becomes one between varying degrees of Medicare for All.

But I’m still drawn to an emphasis about who is more likely to cause change. Despite not passing some leftists’ purity tests, Warren would still be the most progressive President ever. The differences between her and Bernie ideologically don’t seem relevant to me compared to her success and approach at actually changing the system.

I’m a much more moderate person, so Warren and Sanders are not my preferred Democratic candidates. But I think for people holding views on the farther left end of the spectrum, Warren should be the candidate they push.

I reviewed “Capitalism, Alone” by Branko Milanovic for the Liberal Currents website.

To supporters of the market economy, the fall of the Berlin Wall nearly thirty years ago was supposed to mean the undisputed triumph of capitalism. But tensions today from increasing inequality, the rise of populism, and the remarkable growth of China have thrown this foregone conclusion into doubt. Did capitalism’s supporters take a premature victory lap? In his recent book Capitalism, Alone, Branko Milanovic argues that while we must resolve some of capitalism’s internal contradictions and countries like China show there is more than one recipe, capitalism is here to stay.

And:

In the end, Milanovic’s greatest contributions in Capitalism, Alone come from his fresh approach to the history of different capitalist countries. His taxonomy of Western countries evolving from classical, social-democratic, and now liberal-meritocratic capitalism helps us put the current state of affairs into better context and think about the ways policy can and cannot improve the system. While he is overconfident in political capitalism as a dominating force in global politics and a sustainable alternative to liberal capitalism, his analysis of the forces and magnitudes of different kinds of inequality give a more nuanced story than is often found in public discussions.

Check it out!