Reimagining the Corporate Form: Toward a More Democratic System of Corporate Governance

This entry was also posted at the HLPR Blog: Notice & Comment.

Occupy Wall Street has, in the words of John Paul Rollert, “come to embody a common sense that something is wrong with American capitalism.” The problem Rollert points to is not with capitalism itself, but with a particular American version that has ceased to work for broad cross-sections of its population. Given America’s Depression-level income inequality and near-record levels of private indebtedness, it is extremely tempting to focus on bad outcomes as the problem. But the real issue is that many of the economic and political structures that we take for granted repeatedly produce unequal, undesirable outcomes. If reformers seek to make American capitalism more inclusive, the focus needs to be on fixing these structures and getting the rules right.

It has been a steady mantra of Occupy Wall Street not to make demands of existing political leaders and institutions. But as Matt Langer explained, “the reasoning behind not making demands most certainly does not preclude making demands of our collective imagination.” Whether people prefer to work within existing structures or not, the next essential step is to understand how broken institutions and flawed incentives created this mess and to start imagining what structures can be built in their place. Where better to start than with corporations?

Current Governance Structures and Their Shortcomings

Consider the role that our system of corporate governance has played in producing some of our current imbalances. Excessive risk-taking, stagnating wages, and the spike in executive compensation can all be linked back to a system of corporate governance that privileges management’s interests at the expense of other actors.

It’s by no means an original observation to say that boards are under the sway of management. Indeed, the US is something of a global outlier in allowing a business’s president/CEO to appoint its board of directors, and in some cases the president/CEO also serves as the chair of the board. Not only does the board’s composition fail to reflect a company’s owners, employees, or investors; boards are also held only to a relatively relaxed legal standard. As a result, directors often find that their interests (i.e. staying on the board) are best served by taking a passive role and letting management make most of the choices. In light of this structural failure to limit conflicts of interest, it should be unsurprising that the interests of employees, shareholders, and other stakeholders are, at best, secondary to those of executives. As Harvard Law Professor Mark Roe succinctly phrased it, “the US is managerialist, not capitalist.”

Current governance arrangements have had an enormous impact on the larger economy and on the distributive features of American capitalism. To begin with, the existing corporate governance system (in conjunction with other regulatory failings) has proven inadequate to keep excessive managerial risk-taking under control. Despite the Enron disaster, the fall of Bear Stearns and Lehman Brothers, and the near-collapse of many of America’s overleveraged financial firms in 2008, we appear to have done nothing to address this issue. These risk-induced failures were repeated last week in the near-overnight fall of MF Global. As though nothing was learned, the star-studded MF Global board sat by and, in Steven Davidoff’s words, “gave executives free rein to take tremendously risky bets that brought the house down.”

In 2008, Martin Lipton and his colleagues at Wachtell prepared an excellent memorandum on boards’ responsibility for risk-management, which was posted at the HLS Forum on Corporate Governance. In discussing the legal framework for risk-management, they advised corporate boards to go beyond the minimal requirements created by the leading state law case, In re Caremark. Nonetheless, this is how they summarized the state of the law: “These cases demonstrate that it is difficult to show a breach of fiduciary duty for failure to exercise oversight; these cases do not require the board to undertake extraordinary efforts to uncover non-compliance within the company.” Federal laws like the Sarbanes-Oxley Act do require auditing and increased oversight from the board, but the overall implications remain: the decision-making center of gravity remains largely with executives, whose personal incentives to post short-term profits can fuel excessive risk-taking, and current law gives boards few incentives to keep that risk-taking in check.

The problem is not just that boards are passive and deferential, but that those who want risk limited cannot make themselves heard. These high-risk strategies often run counter to the interests of other stakeholders, including bondholders and shareholders, whose interests are not reflected in the board’s composition and thus are not sufficiently represented. The idea that the broader public or the employees whose jobs are on the line would have a say is, under current thinking, not even a remote possibility.

The resultant proximity between boards and management has a lot to do with runaway executive pay. Board members usually have a stake in their position, and because they are appointed by management, it’s often not in a director’s interest to start ruffling the CEO’s feathers. As Lucian Bebchuk and Jesse Fried argue in their excellent book Pay Without Performance, “structural flaws in corporate governance have produced widespread distortions in executive pay.” Their argument, briefly, is that boards have too many incentives to go along with management and are therefore unable to contract with executives at arm’s length. This broken feedback loop is at the root of the ridiculous pay packages, bonuses, and golden parachutes we’ve seen over the past decade.

The wage stagnation that’s affected the remainder of the workforce shares a common origin: all stakeholders other than executives are systematically excluded from decisions that determine compensation. The fact that corporate profits remain at near record highs suggests that the problem is indeed structural and not attributable simply to changes in the labor market. The absence of a voice for employees either in management or on the board of directors, in conjunction with weakening collective bargaining rights, means that the record profits businesses have been posting get funneled mostly to executives and do not translate into gains for the average American worker. The rules that determine who gets to cut the pie, in other words, have a lot to do with the fact that CEOs went from making 24 times what the average worker did in 1965 to making 185 times as much in 2009.

[Figure: Ratio of CEO compensation to average worker’s compensation. Source: Economic Policy Institute, 2011, via SCSPI.]

More Inclusive Alternatives to Minority-Rule Governance

Corporations do not have to be organized in this way in order for the private sector to prosper or for the economy to grow. Recent events should make it clear that keeping down transaction costs is not the only concern here. A number of compelling alternatives exist. I start with the more moderate reform proposals and conclude by proposing that we look to the German corporate model or other structures that afford investors and employees a role in a company’s management.

Calls are frequently made to enhance the role of shareholders in decisions involving executive compensation and risk-management that happen at shareholders’ expense. Bebchuk and Fried have argued that it’s possible to improve transparency and accountability by giving shareholders a greater say on pay, by strengthening shareholders’ ability to unseat and replace directors, or by increasing the number of independent directors (i.e. directors not employed by or doing business with the company). Another proposal they describe would give shareholders the ability to amend the corporation’s charter. Any long-term solution to these agency problems entails providing investors and owners with a permanent vote or some structural role in decisions that affect them.

An increased role for employees is also necessary to prevent some of the imbalances that have arisen between management and the average member of the workforce. Randall Thomas and Kenneth Martin, for example, have argued that labor unions and related entities should be allowed to make shareholder proposals. It would be possible to go even further by affording both investors and labor a role on the board and a larger say in major decisions that affect a company’s future. This is precisely what the German corporate governance system does. The German Codetermination (Mitbestimmung) system provides employees a role in the company’s management and has proven remarkably successful across a number of economic sectors. And although German income inequality has grown in recent years, “income inequality in Germany is a long way from reaching US proportions.”

I point these out not to advocate any particular corporate form, but to observe that there are alternatives that can address failings of the existing system. It’s important also to observe that things were not always this way. The internet has fostered an explosion in new forms of social organization, and cooperative membership structures are another potential source of ideas. There’s no reason that running a successful business means accepting a one-size-fits-all corporate model, particularly when that model marginalizes a company’s most committed participants—its investors and its employees.

Capitalism isn’t a single thing or a system of natural laws. It is a system whose rules are shaped by political—and ideally democratic—choices. Nowhere is this more obvious than in the reified legal fiction of the modern corporation. The absence of democracy within corporations is a central reason that the US has seen such a proliferation of high-risk investment strategies, and an unprecedented divergence in incomes. The concerns of both investors and employees have been systematically subordinated to the interests of America’s managerial class. The failure to create an inclusive economy is fundamentally a failure to build inclusive institutions. And the first step to fixing this problem is remembering that the rules that govern institutional decisions can be different.

Short-termism is a kind of contagion

An emphasis on short-term performance does not always produce a long-term viable strategy. That looks obvious enough when typed out. But short-termism has become the prevailing logic of many American institutions, none more radically than those institutions that make up its financial sector. As Sheila Bair noted in an op-ed on her last day as chairman of the FDIC, our media, political institutions, and businesses fall victim to this tendency, and it has begun to undermine our long-term stability.

To persist in acting this way would require a kind of insane faith that what’s good for now is good for tomorrow. It’s the story of every tragedy of the commons. These attitudes, however, persist less because of any rational deliberation than because institutions can easily devolve toward incentives that reward short-term results.

One principal reason for this is that short-term strategies have a tendency to spread. In a way, this might even be a more general feature of unrestrained competition. I want to make a slightly tenuous comparison to evolution before returning to the more general point.

Assume there is a small plot of land with two unrelated breeds of plant. If plant B can absorb soil nutrients faster and outbreed plant A, it will proliferate and might eventually displace plant A entirely. This is known as exploitation competition. The evolutionary pressures on A become: 1) depend on fewer nutrients, 2) develop some alternate replication strategy, or 3) simply beat B at its own game, by reproducing faster and extracting nutrients more quickly. The last of these is the one I want to emphasize: short-termism is self-reinforcing and it is contagious. When B’s reproductive strategy is premised on short-term success, it redefines the game for A. Eventually it becomes the only game left. This is what kudzu did when introduced in the southeastern United States. Taken to an extreme, quite literally, this is the logic of cancer.

In other words, if B chooses to play a shorter-term game than A, that redefines the game A must play to survive. Market competition is also susceptible to this dynamic, and in many ways it may account for some of markets’ successes. The process can weed out under-performers and produce more efficient manufacturing processes. But it also weeds out other business models that under other conditions would be perfectly viable and sustaining.
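The displacement dynamic in the plant example can be made concrete with a toy simulation. This is my own illustration, not drawn from any cited source, and the parameters (a fixed nutrient pool, extraction rates of 1.0 and 1.5) are arbitrary assumptions; the point is only that a modest per-round advantage in extraction speed compounds until the faster extractor owns the whole plot.

```python
# Toy model of exploitation competition (my own illustration, not from
# the post): two breeds share a fixed nutrient pool each season. Breed B
# extracts nutrients faster, so it captures a larger share of the pool
# every round and gradually crowds breed A out.

def simulate(seasons=50, a=100.0, b=1.0, rate_a=1.0, rate_b=1.5, pool=200.0):
    for _ in range(seasons):
        # Each breed's claim on the pool is proportional to its
        # population weighted by how aggressively it extracts.
        demand_a = a * rate_a
        demand_b = b * rate_b
        total = demand_a + demand_b
        # Next season's population is proportional to nutrients captured.
        a = pool * demand_a / total
        b = pool * demand_b / total
    return a, b

a_final, b_final = simulate()
# B starts at 1% of A's population but ends up with nearly the entire
# pool, because its share grows by the factor rate_b / rate_a each round.
print(round(a_final, 4), round(b_final, 4))
```

Nothing about breed B is "better" in any long-run sense; it simply plays a faster extraction game, and the shared pool rewards exactly that.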

This fact alone should also provide a compelling reason for market regulations—something I’ll write about another time—but this dynamic also means the following: If a business starts engaging in rent-seeking activities (i.e. attempting to influence government into creating a legislative or regulatory playing-field more favorable to its interests), then quickly other competitors, other businesses, and even entire sectors may be forced to follow suit.

To make the link now to the financial sector: Incentives in the financial community have become tied more closely than ever to short-term performance. Such incentives have the potential to reward speculation, and the 2008 crisis revealed that these incentives have the potential to reinforce bubble-generation. Extreme short-termism redefined the terms of competition. It drove firms that emphasized longer-term performance and responsible practices into obscurity and irrelevance, and it drove many organizations into riskier positions to remain competitive (e.g. the decision at Fannie Mae to get into subprime mortgages in 2007). Given how incentives were linked to performance, it should not be surprising that the terms of competition became what they did.

Nor is it surprising, as Bair notes, that short-termism has come to characterize many of Wall Street’s interactions with Congress and other regulators. Rent-seeking through lobbying and other activities directed toward obtaining a favorable regulatory playing-field has now become part of the way that businesses in America compete. To take one easy example, provisions of Dodd-Frank that were seen as restraints on business were cut or watered down, and those that remained have been implemented half-heartedly. And that happened despite a general consensus that Dodd-Frank was not aggressive enough in providing the US the framework it would need to respond to another financial crisis.

[Legislators are now plagued by a similar dynamic of having to fund-raise to keep up with each other, with a short-term focus on reelection rather than on governance. This further exacerbates the influence Wall Street spending can have].

Given the various ways in which Wall Street successfully defeated attempts to impose new regulations after the 2008 crisis, we would expect the financial sector to be well-positioned for the coming decade, or at least to handle another crisis. But this hardly seems to be the case. The shadow banking system, probably the single largest accelerator of the crisis’s spread, remains largely unregulated. Banks are fighting tooth and nail against heightened capital requirements. And large financial institutions’ push for austerity measures is so short-sighted as to ignore any possible interdependence between growth and a healthy middle class. Etc. etc.

Perhaps the most insane thing about all of this is that large financial institutions and proponents of deregulation are so short-sighted that they believe this kind of game is actually serving their interests. [Or maybe the game is just to be the last one standing?]

To quote a post at Digby’s blog about Murdoch’s ability to rapidly corrupt the WSJ, one of the world’s “most important sources of financial news”:

I think this may be the best sign yet of just how crippled our institutions have become. If there is one group in the world who should demand unadulterated facts and data it is the financial community. Sure, they’ll play it to their advantage, and care not a whit about how it affects our democracy. That’s not their job (although it is their duty as citizens.) But they simply cannot function properly if their information is tainted.

The ‘invisible hand’ produces races to the bottom just as often as it produces self-regulating systems. We have failed utterly to set the terms of this game in ways that keep this short-term contagion in check. When that happens, even the winners are at risk.

Photo credit: Galen Parks Smith.

Debt: The First 5000 Years

This is just a short follow up on my last post, to observe that there’s absolutely nothing new about the dynamic underlying the economic policy of the United States or the resistance to the austerity packages we’re seeing across Europe. I just ordered David Graeber’s excellent-sounding book, Debt: The First 5000 Years. If this summary is any guide, Graeber is suggesting that debtor-creditor relationships predate currency-based markets and are one of the fundamental organizing devices in human history:

Every economics textbook says the same thing: Money was invented to replace onerous and complicated barter systems—to relieve ancient people from having to haul their goods to market. The problem with this version of history? There’s not a shred of evidence to support it.

Here anthropologist David Graeber presents a stunning reversal of conventional wisdom. He shows that for more than 5,000 years, since the beginning of the agrarian empires, humans have used elaborate credit systems. It is in this era, Graeber shows, that we also first encounter a society divided into debtors and creditors.

With the passage of time, however, virtual credit money was replaced by gold and silver coins–and the system as a whole began to decline. Interest rates spiked and the indebted became slaves. And the system perpetuated itself with tremendously violent consequences, with only the rare intervention of kings and churches keeping the system from spiraling out of control. Debt: The First 5,000 Years is a fascinating chronicle of this little known history—as well as how it has defined human history, and what it means for the credit crisis of the present day and the future of our economy.

UPDATE (11/7/11): I wrote a review that was published in Guernica a few weeks back.

The Debtor-Creditor Divide

Several weeks ago, Bob Kuttner published a short piece called “Debtor’s Prison” in the American Prospect. The distinction he offers now seems quite clearly to be one of the emerging battlegrounds in global politics: between rentier creditors and debtors. This is a line that’s deeply obscured in our political discourse but one that underlies virtually every economic debate. Reading this article and Paul Krugman’s follow-up offered one of those rare, paradigm-shifting moments where a number of seemingly disparate and complicated elements all fell together into one coherent picture.

The basic idea is that decades of U.S. financial deregulation and the government’s response to the financial crisis have systematically favored the claims of creditors and transferred the losses and downside of their risk to taxpayers, homeowners, and less sophisticated borrowers. To quote Krugman, “everything we’re seeing makes sense if you think of the right as representing the interests of rentiers, of creditors who have claims from the past — bonds, loans, cash — as opposed to people actually trying to make a living through producing stuff.” I also recommend Yves Smith’s post on the costs of rentier rule.

Geithner, Summers, and most of Obama’s financial team have done little to alter this development. Charles Ferguson, the director of Inside Job, has drawn attention to regulators’ unwillingness to see creditors take even minor haircuts. This has proven true even in egregious cases such as AIG’s toxic credit default swaps, in which creditors cashed in at 100 cents on the dollar during the bailout. As Sheila Bair, the Bush appointee who recently stepped down as Chairman of the FDIC, noted in her wonderful farewell op-ed in the Washington Post, there were few prominent regulators advocating that banks write down losses for the bad mortgages and other bad borrowing. Quantitative easing and the Fed’s decision to keep interest rates near zero have likewise served as a second stimulus for large creditors but have not translated into economic growth or more jobs.

As Tim Harford argues in his book, Adapt: Why Success Always Starts With Failure, we need individuals and businesses to take risks because success is an iterative, evolutionary process of failing and building off of what works. But when we punish borrowing so harshly and reward rentiers uncritically, we destroy incentives to innovate and instead encourage speculation and bubble formation. Getting people above water in their homes and allowing businesses to take loans on favorable terms is precisely what we need to stimulate demand and begin an economic recovery. Even Goldman Sachs acknowledged that this job crisis is a problem caused by too little aggregate demand.

But instead the response has been more of the same. In order to stay in creditors’ good favor, Europe is facing what feels like an endless series of sovereign debt crises and austerity measures forced on its population, and the U.S. is quickly throwing itself down the same rabbit hole. By imposing austerity measures and penalizing borrowers, policy-makers risk creating an entire class or an entire generation that’s too indebted to innovate, move, or seek additional training — creditors need to wake up and realize they’re not getting paid this way either.

Perhaps the most frustrating aspect of this entire situation is that because our political system is so closely tied to its own creditor class, this conversation is completely absent from the current fiscal debates. As Peter Dorman rather tragically observed:

“There are lots of interesting, complex issues in political economy. None of that matters now: the world is in the hands of politicians governed by expediency calculations whose time horizon can be measured in weeks. As far as I can tell, the gross illogic of their policies is simply beside the point.”

Rethinking the Faith: A Review of Richard Posner’s A Failure of Capitalism

This book review was originally written for the Harvard Law & Policy Review Online.

Richard Posner, A Failure of Capitalism: The Crisis of ’08 and the Descent into Depression, Harvard University Press, 346 pp., $23.95

Richard Posner’s A Failure of Capitalism: The Crisis of ’08 and the Descent into Depression is about a macro-economic crisis. It is also a surprisingly inward-looking book. Richard Posner, a judge on the Seventh Circuit Court of Appeals and a law professor at the University of Chicago, has been a prominent figure in the law and economics movement, an effort to bring insights from economics into legal and policy discussions. The sudden economic crisis in the fall of 2008 clearly took Posner, like most Americans, by surprise. Any reader familiar with his reputation in law and economics will see that the 2008 crisis has forced Posner to challenge some deeply held ideological beliefs.

Posner’s disappointment with the economics profession in light of the 2008 economic crisis is understandable. The profession’s collective failure to spot the housing bubble strikes at many of the principles that Posner had built his career around. In A Failure of Capitalism, Posner begins looking in new directions and strives to wrap his head around the crisis: how did the market, the government, the professionals, and the experts all get this so wrong? For a laissez-faire Chicago economist and the leading intellectual force behind the law and economics movement, Posner’s questioning of his own convictions represents an enormous turning point in the country’s intellectual climate. His book is a testament to his versatility as a thinker and to his willingness to confront the 2008 crisis with the appropriate amount of seriousness and self-criticism.

A Failure of Capitalism is readable, thorough, and unforgiving toward all of the parties involved in producing the crisis. No actors are left out—the banks, the sub-prime lenders and borrowers, the Fed, the lame-duck Bush administration, the executives with the wrong incentive-structures, the palpably absent regulators, the American consumers who borrow endlessly and never save a penny, the risky new financial instruments, decades of laissez-faire faith in markets, and most of all, former Federal Reserve Chairman Alan Greenspan and low interest rates—no one gets a free pass.

This interconnected array of actors and complicated financial instruments does not lend itself to a linear narrative. At times, Posner errs toward over-inclusiveness instead of orderliness, and he frequently veers off track with asides and peripheral economics lessons. Yet through the frequent attempts at self-summary and the regular repetition of themes, a coherent non-technical account of the crisis begins to emerge.

The timing of A Failure of Capitalism is perhaps the clearest insight into its shortcomings. Posner attempted to assay the destruction of the financial crisis before the dust had even settled, making the book feel as though it were written hurriedly in order to pull off a May 2009 release. This rushed quality shows, both organizationally and in some of Posner’s bolder claims that have failed to pan out. In particular, it is hard to reconcile Posner’s eagerness to label the crisis a “depression” and “a failure of government” with his admonitions that 2009 was too soon to initiate regulatory reform. There is much the book does well, however, and its contents deserve public airing. Unfortunately, it is too unkempt and alarmist to serve as more than a starting point for a more deliberate discussion of this crisis. Indeed, less than a year later, Posner has already published a second book on the topic, The Crisis of Capitalist Democracy.

A Failure of the Market and an Absentee Government

Posner’s vision of what went wrong centers on low federal interest rates and leverage—the heavy use of debt to supplement investments. A decade of cheap credit encouraged borrowing of all kinds, fueled spending, and pushed investors toward higher-return, riskier financial products. It was not uncommon for institutional investors to be leveraged as much as 30-to-1, meaning that for every dollar of their own capital, they controlled roughly thirty dollars of assets, the rest funded with borrowed money. When this risk was aggregated into a small number of elite financial institutions, the first signs of instability were enough to send shockwaves through the economy. While Posner paints a somewhat more disjointed picture, I have attempted to trace a line through what he calls the proximate causes of the crisis.
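The arithmetic of leverage is worth pausing on, because it explains why small price moves could threaten whole institutions. The sketch below is my own illustration of the mechanism, not a calculation from the book; the function name and parameters are my own.

```python
# Illustrative sketch (my own, not from Posner's book): how 30-to-1
# leverage amplifies gains and losses. The investor's equity absorbs
# the entire price move on the full borrowed position.

def equity_after(price_change, leverage=30.0, equity=1.0):
    """Remaining equity after a fractional price change on a position
    of size equity * leverage, the excess funded by borrowing."""
    position = equity * leverage       # total assets controlled
    debt = position - equity           # the borrowed portion
    new_position = position * (1 + price_change)
    return new_position - debt         # debt must be repaid in full

# A 2% gain on the position is a ~60% gain on equity...
print(equity_after(0.02))    # ≈ 1.6
# ...while a drop of just 1/30 (about 3.3%) wipes the equity out.
print(equity_after(-1 / 30)) # ≈ 0.0
```

The asymmetry is the point: at 30-to-1, a routine fluctuation in asset prices is enough to leave a firm unable to cover its debts, which is why the first signs of falling home prices propagated so quickly.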

Above all, Posner blames (in a chapter unflinchingly titled “Apportioning Blame”) the Federal Reserve and its former chairman, Alan Greenspan, for setting interest rates dangerously low for an unprecedented span of years. With interest rates low, saving became less attractive for consumers and investors, encouraging excessive spending and borrowing throughout the economy. The traditionally steady housing market absorbed much of the excess credit as people sought out investment properties and an increasing number of people became first-time homeowners. The growth in the housing market pushed mortgage brokers, urged on by financial institutions, to extend mortgages to riskier (sub-prime) borrowers and to offer complicated products like adjustable rate mortgages that became more expensive over the life of the loan. This brought even more people into the housing market, thus fueling further price increases, and produced what is now well understood to be the “housing bubble.” As it became evident that many of these borrowers were in houses they could not pay for, demand leveled off and home prices started to fall. With no market demand, people were stuck with homes they either needed to flip or simply could not afford, and they started defaulting on their mortgages.

The bursting consumer housing bubble quickly spilled over into the financial markets, where banks and other financial institutions had invested heavily in new financial products tied closely to the consumer housing market. Relying on credit agency ratings and using historic models of mortgages as safe investments, financial institutions staked billions of leveraged dollars on mortgage-backed securities. These securities were composed of groups of mortgages, whereby investors would receive returns as payments were received on the underlying mortgages. Posner emphasizes that it was not mortgage-backed securities themselves that produced the fundamental instability of the financial institutions. Leverage and the extent to which institutions had become interdependent were far more precarious. If asset prices fall far enough, a leveraged firm’s ability to repay its borrowed capital is called into question, which in turn leaves its lenders less likely to recover on their loans. The risk inherent in such an arrangement can spread quickly throughout the entire financial sector. Like a sinking ship, one failing firm can quickly pull down everything in its proximity.

When consumers started defaulting on their mortgages, mortgage-backed securities quickly declined in value and investors rushed to unload them. With no market to buy the securities, these investments quickly became unpriceable and soon earned the name ‘toxic assets’. Investment banks like Lehman Brothers held vast sums of these securities, previously worth billions, that were suddenly hard to price. This rendered the entire value of the company uncertain and set off a cascade of investors trying to liquidate their investments for cash, producing essentially a bank run on Lehman Brothers. Lehman was forced into a fire sale and could not scrounge up enough cash to cover its debts quickly enough. This quickly turned into a chain reaction. Other institutions dealing with Lehman were unable to call in their investments and suddenly confronted a similar threat. If they didn’t get rid of their mortgage-backed securities and get their money from Lehman, they wouldn’t be able to cover their own debts. The whole financial system lurched.

This story is, as Posner insists in numerous passages, a series of market failures enabled by an absent government. Although the government might have contained the damage by preventing the Bear Stearns or Lehman Brothers collapses, Posner sees untrammeled markets as the root cause of the disaster. Even in evaluating the government’s culpability, the central failing is the dearth of political will and an ideological faith in markets. Washington left Wall Street relatively free to set its own rules, and many preexisting rules were stripped away during the 1990s and early 2000s. Posner also finds fault with the government’s unprepared, improvised response to the collapse of both Bear Stearns and Lehman Bros. But according to Posner’s version of the story, the government’s primary failing was its glaring absence—an absence which enabled a race to the bottom as market actors invented new, riskier ways to gamble.

In retrospect, according to Posner, the 2008 crisis could have been averted, or at least mitigated, in any number of ways: higher federal interest rates, enhanced capital requirements for institutional investors, limitations on leveraged investments, derivatives regulations, more stringent mortgage requirements, compensation packages that do not incentivize excess risk-taking, etc. Unfortunately, prior to 2008, hardly anyone seriously contemplated any of these options. The government failed to provide a sufficient regulatory regime, but investors and academics were similarly blindsided by the crash. Faith in the infallibility of markets provided the backdrop to the entire calamity.

Posner writes, “[T]he depression has shown that we need a more active and intelligent government to keep our model of a capitalist economy from running off the rails.” It is hard, however, to know exactly what Posner has in mind. Any call for a more active government role is undermined by Posner’s own concerns about the costs of regulatory reform and the book’s refrain that “a depression is not the right time for regulatory innovations beyond the bare minimum essential for recovering from the depression.” My view is that we should be done by now with the idea that a market without a government is possible. Such a thing has never existed. The more appropriate question is how we should balance the two. We need to learn how we can use government to encourage healthy market growth while preserving stability and innovation.

An Ideological Failure

While never excusing the oversight, Posner acknowledges that there are plenty of plausible reasons why nobody, not even the economics profession, saw the crisis coming. In a market system, our frame of reference tends to extend only to the winners and losers of the very recent past. Even the most sophisticated computerized models lacked historical data about many of the forms of risk being traded. Many models accounted for the possibility of individual defaults, but few considered the possibility of mass foreclosures or other forms of system-wide risk. But academics and financial analysts were not the only ones looking in the wrong directions. The Republican Party had invested politically in a free-market ideology, and fiscal conservatism was becoming increasingly common even among political moderates and socially liberal Americans. Ideologically, the entire nation was blinded by its preconceptions about the ability of free markets to self-correct.

The dearth of political will in Washington, while blameworthy in hindsight, can be explained at least partially by our democratic process. Americans were doing well financially, and there had not been a serious financial collapse in nearly 80 years. Wall Street was coming up with innovative investment products faster than Congress or a dedicated agency would have been able to follow. The SEC, though understaffed, failed in its duty to police corruption, nowhere more evident than in its failure to prosecute the Madoff Ponzi scheme that had been brought to its attention on several occasions. Neither the financial professionals with billions at stake nor the relatively more insulated economics experts anticipated any system-wide threat. Economic regulation had almost no political salience, and any connotations it did have were mostly negative.

Even to the extent warning signs were present, our political and financial incentives are currently structured in such a way that there is little to be gained by spotting a bubble early. As Posner writes, “virtually all warnings are premature.” Until a bubble is at its crest, it is still possible for rational economic actors to make money in the short term. Particularly when everyone else in the economy is profiting off of a growing bubble, the pressures to ride the wave are overwhelming. This, I believe, provides one of Posner’s most devastating critiques of our market arrangements and one of his strongest cases for governmental regulation. Information cascades and the absence of regular feedback have given us an economic system that regularly generates bubbles that can only be identified in retrospect. If investing in a bubble is rational from an individual investor’s standpoint, we need a mechanism for pricking them before they grow large enough to create systemic risk.

Posner notes that there is a related problem in the political and academic arenas that makes them similarly unlikely to identify bubbles in advance. There are inadequate incentives for spotting or preventing a crisis before it happens—if an intervention prevents the crisis from happening, nobody knows how bad things would have been had the intervention not taken place. Posner writes, “The point I want to emphasize is that it is very difficult to receive praise, and indeed avoid criticism, for preventing a bad thing from happening unless the probability of its happening is known.” If someone had proposed policies that would have averted the “housing bubble,” it is very likely that they would have become politically unpopular, perceived as anti-competitive or as offering only theoretical benefits.

In one of the most provocative portions of the book, Posner argues that what went wrong was not caused by investors or consumers behaving irrationally. He maintains that the financial crisis was a result of ‘rational’ market actions within a poorly regulated economic structure. The distinction is important for a few reasons. When investors behaving rationally in pursuit of profits still produce catastrophic bubbles, the problem is structural and not attributable to any particularly greedy or wayward actors. The inability of markets to stop this process is a classic common-resource problem, in which individually rational decisions become collectively irrational. This is a clear instance of a predictable market failure and a standard case for government intervention. If bubbles are an inevitable byproduct of rational actors within our market economy, then absent appropriate regulation, it is only a matter of time before the next crisis hits.

Posner is less forgiving of the economics profession, though he does offer several reasons that most economists missed the warning signs. It has been a long time since the country last confronted the possibility of depression. Moreover, the trend toward mathematical and statistical analysis in economics has not produced much fruitful research about depressions, since very little historical data is available. In addition, fragmentation within the profession has encouraged tunnel vision, as financial economists rarely communicate with macroeconomists in their work and often do not even write with the same vocabulary. Political and intellectual factionalism within the profession has also disrupted consensuses and made it more difficult for policymakers to obtain neutral information about the economy. But above all, according to Posner, economists fell victim to the same ideological preconceptions that blinded those in government and the private sector—their faith in the functioning of free markets meant that many economists simply could not imagine the possibility of an economic crisis.

Posner praises the lone economists, such as Nouriel Roubini, Raghuram Rajan, Dean Baker, and Paul Krugman, who spotted the housing bubble and predicted the financial crisis before it hit. But without consensus among economists, these warnings were too scattered to have the necessary effects on prominent actors in government or the banking industry. The social costs of economic crisis or a protracted recession are unacceptably high, and the fact that something of this magnitude could go largely undetected delegitimizes the entire economics profession. Posner suggests a more centralized way for information about the economy and the financial sector to be collected and analyzed so that economists and regulators can begin to contemplate the way risks interact and accumulate to affect system-wide stability.

Depression Obsessed

Posner’s eagerness to diagnose the economic and financial crisis of 2008 as a full-fledged depression provides one of the book’s most curious features. While in 2008, the financial crisis may have looked like a bottomless pit, a year later the worst of the crisis does appear to be over. At a time when policy-makers were terrified that the very mention of the word “depression” might turn into a devastating self-fulfilling prophecy, why was Judge Posner so impatient to label the 2008 economic crisis a depression?

Posner’s fear of depression stems from the fact that, like the Great Depression of the 1930s, the 2008 economic crisis originated out of a crisis in the financial sector. A financial crisis creates a lending freeze that makes it difficult if not impossible for the Federal Reserve to stimulate the economy by expanding the money supply. At the time Posner was writing, it was still unclear that the bailout was enough to ensure that banks would start extending credit to one another or to anyone but the most qualified borrowers. Without banks lending, consumer spending would decline, demand would fall, and the market would quickly enter a deflationary spiral. This situation, fortunately, has been averted, but at the time Posner was writing it did not seem like the most unlikely of outcomes. There is something ironic about this stance, however, considering that Posner spends an entire chapter arguing that it is too early to regulate or inject the government into the market. Consider this passage:

It is a temptation, but I think it would be a mistake, for the new [Obama] administration to try to emulate Franklin Roosevelt’s astonishing first 100 days. The United States fortunately is in less desperate straits today and American government and the American economy, and specifically the American banking system, are all immensely more complex than they were in 1933. [...] Let the comprehensive structural solution await calmer days.

There may be something to the view that the most sensible thing to do right now is to see how things unfold, but that view is undercut by Posner’s own doom-saying elsewhere in the book. One could understand the urge to be on the right side of history by being the first to spot a depression. But if that’s the stance Posner wants to take, it’s a little disingenuous to hedge with a mild endorsement of the bank bailout and nothing but wait-and-see policy suggestions. To his credit, Posner acknowledges in his introduction that the spring of 2009 was too early to fully assess the economic fallout precipitated by the financial and economic crisis. It is unfortunate that he so quickly dispenses with this humility. But then again, Posner’s lack of humility is probably what made this book possible on such short notice.

Far more harmful, however, is Posner’s insistence that it is not appropriate to press for enhanced regulations mid-recession or mid-depression. Not only is he demonstrably wrong to characterize all possible regulations as market-destabilizing, but his call for regulatory postponement also ignores a basic political reality: the further we get from the economic crisis, the less likely any government regulation will ever get passed.