by Robert Teitelman | Published December 22, 2010 at 1:52 PM
It's been two years now since the long, dark winter of 2008 and 2009, when the financial world tilted on its axis and threatened to crash. It didn't. We know that now, but that knowledge offers scant comfort for those who still suffer from the fallout or continue to deny there was a crisis at all, or at least one that required such extraordinary state intervention. The last few years have seen an amazing outpouring of commentary, debate, invective, blather and, yes, some serious reflection about the crisis, what caused it, who was responsible for it and how we can fix it so it never never never happens again. Or at least not for a while.
Having emerged, like a shaggy dog from a muddy pond, what have we learned? I have my own thoughts on all this, fueled by this outpouring of commentary. In some ways the very phenomenon of that commentary, particularly the books and blogs, is part of the larger story; bubbles are, after all, fundamentally problems of information and interpretation. I began this post in the hopes of extracting some conclusions from a now-long shelf of crisis books. It quickly became apparent that wrestling with what those books accomplished, or failed to accomplish, led into deeper thickets where historical origins and causation lurk. I've tried to follow that wandering path here. A warning: This goes on for a bit, so if you're interested, pour yourself a festive drink and settle in.
The first thing to say about following this crisis in a daily fashion is that blogs can suck up a vast amount of your day. There are a lot of them. And when you combine the posts, the magazine and newspaper articles and the books, they tend to merge into a mass of overheated verbiage. They are immediate, to be sure, and on the cutting edge. But they also age like day-old lettuce. Soon after the crisis began, Judge Richard Posner not only began to blog about the affair; he then produced two books, a year or so apart, that analyzed the various issues at hand with alacrity. Both "A Failure of Capitalism" and "The Crisis of Capitalist Democracy" were compelling reads at the time -- OK, for the wonks among us -- and notable for their high seriousness and sophistication. Particularly in "The Crisis," Posner unreels any number of sharp insights about shareholders, regulation and economists. But he was also clearly writing as fast as he could -- how could he not? -- with one eye cocked to whatever disaster was taking place. Despite the titles, which suggest conclusions for the ages, both tomes resemble briefing books (or, not surprisingly, judicial decisions) more than deep reflection. Posner's value, and it was not inconsiderable, was to disassemble the various issues in a serious way, not answer them for the ages.
The good news here is that the books generally have grown more reflective over time. Books like Bethany McLean and Joe Nocera's "All the Devils Are Here" profit from the need to sharply focus on a theme -- they take apart the real estate bubble -- and from all the rest of the reporting and writing that's gone on before. They can assume a longer view; they can synthesize more effectively. Their book seems to be selling well, but the problem, of course, from a publishing perspective is that just as we reach a point where serious thinking can take place, the audience may well be exhausted on the subject.
And yet we have gotten to such an interesting place -- "interesting" employed with all its ambiguity intact. Again the question: What have we learned? Well, the obvious. There was too much leverage, too much complexity, too much size, too many interconnections. We should have paid more attention to derivatives and securitization. Real estate was a vast bubble, inflated by a combination of traditional greed and new-age innovations. Both risk management and regulation were captured, declawed and turned into kittens. Deregulation was part of the problem, but exactly which parts were most lethal remains an open question. The New Deal regulatory system began to break down decades ago, and that breakdown accelerated. There were good reasons for that breakdown, and bad. Globalization contributed to the mess in a variety of ways. Clearly there were significant imbalances that allowed weak political and financial leadership in the U.S. to believe that Americans, and the U.S. government, could endlessly borrow from those nice Chinese. And, oh yes, there was the Greenspan error: Markets, he was shocked to discover, are neither all-knowing nor all-wise.
Few would debate any of those statements, though they might emphasize deregulation over, say, trade imbalances, or Fannie and Freddie over Glass-Steagall. Many still argue for single-cause theories: Compensation was the original sin, or credit raters, or the ever-popular plutocratic conspiracy, which runs from Simon Johnson's notion of a bank takeover of the state to Matt Taibbi's depiction of Goldman, Sachs & Co. lurking behind every bubble since the Great Depression. There are fragments of truth (and falsehoods) all over the place. Put them all back together again, and we end up with a very different view of the crisis, one that is, alas, shot through with complexities and ambiguities, one that lacks good guys and bad guys, and thus one that is about as politically compelling as raising taxes. On the other hand it's probably closest to the truth, which raises a variety of questions about the relationship between the polity (this being a democracy), the state and the financial system.
It seems obviously true that the crisis was multicausal and that the complicity, or rather involvement, was widespread and long term; this, in fact, is one of the themes of "All the Devils Are Here." It's fine to beat up Alan Greenspan for his regulatory passivity, both on mortgages and derivatives, and his low-interest-rate policy after the dot-com bust. You can hammer Greenspan, Robert Rubin and Larry Summers for being mean to Brooksley Born on derivatives, or even for the fall of Glass-Steagall. But to a man, these are merely symbolic perpetrators of ambiguous sins. The skein of responsibility for the fall of Glass-Steagall goes back into the '70s. The individual and linked decisions that ran through the decades until the late '90s had compelling rationales, if elements of self-aggrandizement, and more importantly, widespread acceptance. Bank consolidation, with its promise of endless ATMs, innovative financial products and global convenience, hardly stirred a whimper of protest. Everyone loved liquidity and proliferating credit. For all the eagerness to separate Wall Street and Main Street, Americans flocked to invest and, more interesting, to speculate (in some cases, of course, they had no choice, particularly when retirement planning became a personal responsibility). The subprime mortgage problem was less an anomalous example of greed and predation and more a logical extension of the belief in consumers as happy participants in the markets. An extraordinary volume of mortgage problems stemmed from refinancings, that is, folks using their homes as piggy banks. But even that was ambiguous. With real incomes stagnating, with healthcare and college costs rising, with middle-class Americans feeling the squeeze, the refi option was often a rational response.
It's very true: Wall Street also evolved in ways that were increasingly speculative, with the concomitant loss of the ethos of the fair-minded intermediary. But consumers drifted toward that same belief as well; in some cases, particularly in the first flush of Internet trading, many truly believed that consumers should be allowed to play the same rough, take-no-prisoners game as the professionals. Why should only the pros get rich? Everyone should get rich; this is America! The corollary to that belief is that inevitable market downturns are somehow not right, a matter of recklessness, even criminality. Another corollary is that experts, reading the market, should be able to predict accurately. The failure of prediction increasingly came to suggest conflict and fraudulence.
This is not an attempt to condone anything, simply to argue that there was something larger, deeper and more inclusive in what befell us (indeed, the danger of fixating on short-term symptoms, like high pay, lousy risk management or the credit raters, is that you'll miss the underlying causes). You have to revisit the '70s to get a sense of the larger landscape. The '70s saw the transformation of Wall Street from the hidebound practices of post-Great Depression finance. Reforms were instituted to eliminate a clubby, self-aggrandizing community of poorly capitalized partnerships and replace it, over time, with a Wall Street dominated by competitive, highly capitalized public companies. At the time, this seemed like change that needed to be made (institutional investors, which were rapidly assuming the role of a kind of tribune of the people, pressed for reform), particularly after the near-collapse of Wall Street in the back-office crisis of the late '60s. On top of all this, and driving reforms on Wall Street, were the economic realities of the '70s: The decade was miserable, arguably as bad as today. The '70s saw the breakdown of New Deal liberalism, after the high-water mark of the Great Society in the '60s. The future increasingly became a matter of entrepreneurs and startups rather than large corporations and managers. The New Deal consensus on Wall Street and banking began to come apart not only under the blows of technological innovation and the first stirrings of a truly open global system, but also under the belief that the U.S. economy needed a more efficient, more powerful and up-to-date financial engine to effectively compete.
Who could argue? No one enjoyed the bleak malaise of the '70s, despite the fact that many of the economic woes of the period stemmed from OPEC and high oil prices, which fell in the '80s. But with greater perspective, it is clear that the '70s represented a painful adjustment to a larger world finally reviving after World War II. Competition was rising. And while it's obvious that free trade can be a win-win proposition, the increasing competitiveness of countries like Japan, South Korea, Great Britain, France and Germany did carve up chunks of America's industrial economy, particularly in the old industrial Midwest. This, of course, is an old story. True, other parts of the population thrived on the steadily lowering costs (particularly after inflation was quelled in the early '80s), notably technology and the service economy and, increasingly, finance. The importance of higher education grew, setting off the galloping inflation in college costs; the apotheosis of the M.B.A., consulting and investment banking arrived. But the deregulatory impulse that swept Ronald Reagan into power featured an underlying critique that presented America as a powerhouse that only needed to be liberated from government regulation and taxes to thrive again. What made this critique work was an all-powerful marketplace that, in sophisticated and academically approved theory, efficiently allocated capital, wisely set prices, drove out shoddy products or unethical behavior and offered the best view of the future.
Again, these were lessons that the great mass of Americans accepted, for whatever reason. The decline (stagflation in the '70s term; competitiveness in '80s lingo) needed to be arrested. The exceptional entrepreneurial ethos, increasingly embodied in Silicon Valley high technology, needed to be released. That critique took many forms. Companies were freed not only from regulations and unions, but also from taxes and retirement benefits. Tax cuts were viewed as a way to drive liquidity and growth; it's shocking now to revisit tax rates in the boom '50s and '60s. The notion of stakeholders, which implied a balancing of corporate responsibilities, faded into shareholder democracy: one set of owners inextricably tied into the market. On Wall Street, the march of new products -- junk bonds, derivatives, securitization -- began, triggering the long, complex evolution toward universal banking and greater dependence on leverage and trading. More generally, a free-agent ethos spread: Both employee and employer loyalties eroded, synergistically. M&A, hostile and leveraged, took off. Technology increasingly eroded barriers. The performance culture spread from the go-go mutual funds of the '60s to nearly everything financial, from the rampant CD shopping of the S&L crisis to the use of computers and the Internet by pajama-clad day traders. You were a fool not to play.
For all of that, the American economy felt better in the '80s, but productivity still lagged and de-industrialization still laid large regions to waste. When the cyclical downturn of the late '80s occurred -- foreshadowed by the frightening '87 market crash -- the competitiveness debate, this time fixated on Japan, began again. That debate might have been economically suspect, but it further drove the idea that the American economy needed to be more efficient, which meant more tax cuts, more reductions in the safety net, more free agentry, more dependence on markets. And then there was the confluence of positive trends, most of them unexpected by mainstream economists who saw a large, mature economy facing increasing growth and productivity issues: The end of the Cold War brought not only a peace dividend but an acceleration of globalization. Japan faded, Europe was forming around the single currency (setting off booms in Ireland, Spain and Eastern Europe), and China, India and Brazil were slowly emerging as economic powers. American technology held sway; and that technology clearly stimulated the sudden rise in productivity in the '90s. Good times. In retrospect, bubble times.
We had come a long way since the '50s. A variety of ideas and trends were tightly interwoven in the economy that rumbled into the 21st century. America was the world's only true superpower. It had the dollar, the world's reserve currency, and military and commercial (and sometimes humanitarian) interests on every continent: an empire. America featured a powerful, and powerfully attractive, consumer culture fed by vast amounts of credit. Indeed, it was a consumer wonderland, amped up even more by the new technologies, notably computing and the Internet, that extended every buyer's reach. Trade deficits and a declining industrial base seemed to be reasonable prices to pay for such a bountiful consumer economy. Woven through all that was the bright thread of American exceptionalism. Democracy, free markets, individualism, the Internet, a consumer marketplace designed to boost desire and provide material choice (and, of course, its very American mirror image: evangelical religiosity) represented the best of all possible worlds. Everyone wanted to come to America. Homeownership (or universal broadband, Wi-Fi or stardom) became the embodiment of the American dream. Most importantly, there was the clear identification of markets with politics. States like California became increasingly "democratic" with regular referendums. But economic thinking -- increasingly ossified into the efficient-market hypothesis -- became more and more dogmatically central. What Americans wanted from Washington was less equity and social justice, more growth, jobs, higher income. In some sense, they had no choice: Responsibility for their own financial welfare had been turned back upon them. They were running just to keep up. The role of American citizens as shareholders grew. Economists were tiny gods, topped by the Great God, Greenspan. "It's the economy, stupid," as the Clinton campaign famously said.
The Sept. 11 attacks only tightened the knot of these notions further. Most of these ideas and trends were complacently viewed as further signs of America's genius. But they produced many unintended consequences, some toxic, like subprime mortgages, an overleveraged Wall Street and a middle class submerging in credit. And then it all blew up.
What does this long and winding travelogue tell us? The roots of the crisis go deeply into the past and cover a lot of ground. You can't just yank a few of them out and expect much to change. In so many ways, it was a systemic crisis, which requires systemic solutions. Consider just the effects of globalization, recurrent waves of American insecurity (which really appear to be related to recurrent bubbles) and the demands of growth. How many of these economic and financial policies were justified with some version of the competitiveness argument and its corollary, efficiency? How much of this was real, how much spin and lobbying? While it's certainly clear that the reality of decline was less serious than advertised (by both left and right), it's also true that the globe has been slowly evolving from one dominated by the U.S. to one featuring many poles of economic dynamism. Now free trade theorists will argue that everyone can profit from a booming world economy, but they leave out two things: sometimes entire regions like the industrial Midwest are left behind; and the psychological investment made by generations of Americans in the notion of American supremacy. One way to mask such relative and inevitable "decline" is to ply the populace with credit, new toys and various versions of the American dream.
Well, we are who we are. This has always been a fervently commercial nation and one accustomed to boom and bust (though one that had also been convinced that such instability was a relic of the past). Today, Americans are slowly reducing their debt and worrying about deficits. Regulators are awake and active, although at this point no fundamental transformation of finance, a la the '30s and '70s, appears to be in the works -- a reality quietly pointed out by David Skeel's examination of the Dodd-Frank financial reform bill, "The New Financial Deal," particularly in his discussion of the corporatist partnership between banking and the state. Will the good behavior last? For a while, but not forever. Banks are public companies and must serve shareholders with earnings per share growth. And consumers? We're already high-fiving a robust Christmas season.
In the New York Times recently, Paul Krugman bewailed the fact that although the ideas of "free-market fundamentalism" had so clearly failed, they seemed to be triumphant in Washington. Krugman has a point, but it's broader than simply that of good ideas and bad. Zombie economic ideas, like supply-side economics or competitiveness, have long held sway, as Krugman himself pointed out several decades ago; but he does not deal with the wreckage of earlier notions of big government and a dominant big business that ran aground in the '70s. The economic crisis has further chipped away at all economic thinking. But the zombie metaphor has a certain validity. It's as if we live in an era where all the gods have died. Suddenly the world is a bewildering place, without the usual touchstones, rituals and sacred texts. We cast ourselves back to a golden age; we seek out villains, symbolic and real; but mostly we go day-by-day pretending the fragments are whole, while waiting for ideas that both work and further dispel our anxieties.
Interesting times. Happy holidays and may the new year be prosperous -- though let's not get carried away.