The Region

Interview with Michael Woodford

Columbia University economist on Fed mandates, effective forward guidance and cognitive limits in human decision making

Douglas Clement | Editor, The Region

Published September 16, 2014 | September 2014 issue

Interview conducted July 23, 2014

Michael Woodford

Though pundits suggested otherwise, there was no straight-line causality from Michael Woodford’s presentation at the Fed’s August 2012 Jackson Hole conference to the FOMC’s December 2012 adoption of inflation and unemployment thresholds. While both involved “forward guidance” and stressed clear communication about a credible policy path, the timing was doubtless coincidental.

But there is also little question that Fed leaders were already well-steeped in Woodford theory, and quite familiar with the arguments he made in August. For nearly two decades, the New Keynesian model*—of which Woodford is a leading architect—has been a key framework for academic research in monetary economics, and bedrock for research and policymaking at central banks worldwide.

With this framework, Woodford and his co-authors have explored and explained the mechanisms by which monetary policy affects employment and production, as well as interest rates and prices, and because his work has such practical utility and intellectual power, the way policymakers think about policy—and arguably, design it—has shifted fundamentally. His insights into policymaking when nominal interest rates can go no lower have been particularly useful.

Woodford’s 2003 Interest and Prices—called a “bible for central banks” by some economists—discussed these ideas at length. “Immensely influential,” said Princeton economist Lars E. O. Svensson of the book, in awarding the 2007 Deutsche Bank Prize to Woodford for establishing “foundations for … models now being developed by the most advanced central banks [and] also providing central bankers with a practical framework [for thinking about] monetary policy, in particular the fundamental role of expectations and transparency.”

The Deutsche Bank award is one of many Woodford has received. While still a graduate student at MIT, he was selected by the MacArthur Foundation for its inaugural class of “geniuses” in 1981. He’s been recognized with fellowships from the Guggenheim Foundation, Econometric Society and American Academy of Arts and Sciences, and awards from numerous other institutions.

Woodford’s intellectual interests are unusually broad. He went to the University of Chicago initially to study physics, then majored in cognitive science, got a law degree at Yale and later chose economics—drawn by both its theoretical rigor and concrete application. “Central banking,” he observes, “is one of the human activities where I think there is some real use to relatively abstract theoretical contributions.”

* Developed in response to the potent 1970s rational expectations/flexible prices critique of then-dominant Keynesian theory and policy, the New Keynesian model accepted some of the critique but argued that rigidities in pricing caused markets to adjust slowly and could result in undesirable fluctuations in employment and production. Stimulative fiscal and monetary policy—if well-designed and implemented—could therefore be effective in counteracting economic downturns.

EFFECTIVE MONETARY POLICY

Region: I’d like to start with some questions about policy, in particular, forward guidance. In August 2012 at the Fed’s Jackson Hole symposium, you gave a very influential speech in which you compared two options for monetary policy when at the zero interest rate bound: forward guidance and quantitative easing (balance sheet) policies.

You argued that essentially both theory and data suggest that forward guidance is likely to be the more effective of the two, and you further recommended that policymakers should make “advance commitment to definite criteria for future policy decisions.”

Four months later, at the December Federal Open Market Committee meeting, the Fed did adopt forward guidance—in the form of thresholds for unemployment and inflation—along with continued quantitative easing. Did that approach meet the standards you would advocate in terms of definite criteria?

Woodford: It was certainly a step in that direction. Not only was it an attempt to shape expectations by making official statements about future policy, but it was in line with what I had been arguing for in at least one important respect, which is that it was saying something about criteria for making a future decision as opposed to trying to announce the future policy settings themselves in advance.

The Fed had already been using statements about future policy as an important part of its efforts to stimulate the economy, particularly dramatically since the previous summer, when it had begun making quite unprecedented statements about specific dates, as far as two years in the future, until which the FOMC anticipated being able to maintain its current unusually accommodative policy. But that approach didn’t involve stating criteria for making a future decision; instead, it only offered a guess about where the federal funds rate would be at specific future dates.

There are various reasons why I think such “date-based guidance” is a less satisfactory way to try to shape expectations about future policy. The most important problem is that it’s unlikely that a central bank would really be making a promise or declaring an intention about future policy, and making it in this very specific form of saying where the instrument will be two years in the future.

And, of course, the FOMC wasn’t really making such a promise. If you looked at the fine print of what they said, they hadn’t said we intend to do this. They hadn’t said we will do this. They had said we currently anticipate that future conditions will warrant our doing it.

Region: The wording was very indefinite, vague: “anticipating conditions,” but not saying what might happen if those conditions aren’t fulfilled.

Woodford: Yes, and not only that: It didn’t even say what kind of future conditions those had to be, only that we currently are anticipating that there will be such conditions, but we don’t have to say what kind of conditions those would be. So whether the conditions are shaping up or not, you don’t really know. Even as you see the news coming in, you wouldn’t really know whether it is or is not developing into the conditions that would warrant the policy. You only know that at some date in the past, the FOMC was anticipating some unstated conditions that it thought for unstated reasons would warrant a particular policy.

That was, I think, an important qualification, although it’s not surprising that it had to be so hedged given that the idea that they would really promise two years in advance exactly where the federal funds rate would be—well, that would be a pretty shocking thing to do, if it were an actual promise.

Region: So, then, in December 2012, the Fed moved ahead to explicit unemployment and inflation threshold figures for policy change.

Woodford: That’s right. This meant trying to say something about specific criteria you would be looking for, which may or may not arise by a certain date, and that should determine whether you are or are not thinking about particular policy changes by that date.

I think that is much more sensible as a way of trying to make a statement about future policy because it is something that you could reasonably say and mean as a statement of intention. And I think it was intended as a statement of the form: “We intend to actually conduct our future deliberations along certain guidelines that we’re announcing in advance.” From that point of view, it was very much what I was calling for.

Now, the specific form of the statement they made was not quite what I had suggested in my lecture, and not really what I would have preferred. But, of course, they had to announce a policy that they thought they could follow.

Region: At Jackson Hole, I believe, you said that nominal GDP target policies were more consistent with what you had in mind.

Woodford: Yes. I had specifically suggested that announcing a target path for nominal GDP would be a desirable way to make an advance statement about the criteria that you would be looking at later.

Now, I wasn’t saying that to suggest that that’s the only formula that would be valuable, but I thought it was useful to give a concrete example showing how the thing that I was talking about could be undertaken in practice. It was a simple proposal that nonetheless incorporated the key elements of what I thought was a desirable form of commitment. I also thought it could be understood by a fairly broad public. It incorporated what I thought were key considerations that people on the FOMC were likely to be concerned about, although evidently it didn’t address their concerns as fully as I had intended, since it didn’t get much traction with them.

Region: You mention commitment here. But if it were faced with a potential scenario of inflation exceeding 2 percent and unemployment low, the FOMC might want to deviate from the nominal GDP target and raise interest rates. Would you be concerned about time inconsistency issues with a nominal GDP target strategy?

Woodford: Any strategy that seeks to obtain benefits now from giving people a reason to expect something later raises the question of time consistency of the policy: The mere fact that you wanted people to expect something earlier may not count as a reason for you to want to actually deliver it later. This issue arises in public policy all the time, as Finn Kydland and Ed Prescott explained in their famous paper.

And the way that we deal with this tension is not, or at least not always, to say that an honest government will never make any promises to do something other than what it should later want to do in any event. For example, we promise not to expropriate people’s property, in order to give them an incentive to make productive investments, and in general this commitment—and a common understanding of why the ability of people to rely on it is important—does provide a substantial check on the temptations to seize property that might otherwise arise. But for that to work, it’s important, at the very least, that there be a fairly clear understanding of what the commitment means; and it may also be necessary that enough people can understand the basic logic of what the commitment was intended to achieve, so that it isn’t viewed ex post as simply a rash mistake that one should hope to be excused from.

A commitment to a nominal GDP target path would raise this issue, but no more than any other form of meaningful forward guidance does and, indeed, no more than does the announcement of an inflation target, as the FOMC made in January 2012.

Apparently some on the Committee are more comfortable with the idea of having to tighten policy to keep inflation from running too high—simply to validate the expectations of low inflation that you had sought to create earlier—than they are with the idea of allowing inflation that might be temporarily above the long-run target rate, simply to validate expectations that nominal GDP would be allowed to catch up to a previously announced target path.

But at a conceptual level, the issue is no different. Probably the reason they are more comfortable with the idea of disciplining their policy decisions through an announced long-run inflation target is that the potential benefits of such a target have been discussed at greater length. It took the FOMC 35 years to catch up with the scholarly literature on that proposal, after all.

Region: Roughly a year later at a Riksbank presentation, you compared what the Fed had done, the threshold approach, with what you had advocated. What are the advantages of the latter, of a nominal GDP target policy?

Woodford: One advantage is it would be a single criterion; whereas, the thresholds that the Fed announced were two different criteria.

Region: Perhaps dueling criteria, at times.

Woodford: Right. There was a threshold for the unemployment rate, but there was also a threshold for inflation expectations. The question of whether those could be in conflict was being sidestepped. I think they were hoping things would evolve in a way that no tension between the two criteria arose, and that turned out to be right, but it was a gamble. If, say, it had begun to appear that the inflation threshold could be breached before we were anywhere close to the unemployment threshold, there would have been a lot of uncertainty about how policy might develop.

A nominal GDP target path would have the advantage of being a single criterion, yet one that conveyed concern both about the real economy and about the price level and nominal variables at the same time. It would have given an explanation for why substantial stimulus would have continued to be appropriate for some time to come. But it was also a criterion that was intended to reassure people that what looked like very aggressive monetary policy was not going to allow inflation to get out of hand. If inflation picked up very much, the FOMC would quickly have reached the nominal GDP target and then would have to restrain nominal demand growth in order not to shoot past the target path. The public wouldn’t have to be worried that we were pushing so hard on stimulating the economy that maybe we were going to let demand get totally out of control, and we were just not thinking about that because it wasn’t the fire that had to be put out this year.

SHIFTING FROM NUMERICAL THRESHOLDS

Region: Earlier this year, the Fed modified its forward guidance; it relaxed its reliance on numerical criteria and moved toward a qualitative form of forward guidance. What are your thoughts about the wisdom of this new approach?

Woodford: I was not surprised that the FOMC had to change its approach. The unemployment threshold was about to be reached, so it was not providing much guidance about policy in the future. Yet the FOMC wasn’t at all inclined to immediately revert to something that would look like precrisis policy, either. The fact that the thresholds ceased to provide useful guidance long before it was time for policy to be “normalized” was, in my view, another of the weaknesses of that strategy.

But given that they had adopted it, it was then difficult to switch to some other form of relatively explicit criterion for what actually would determine when it was time to normalize policy. I agree that it wouldn’t have made sense to announce a new, but lower, unemployment threshold once the old one was reached, and I agree that they shouldn’t have felt that the previously announced threshold required them to immediately begin tightening policy. And it would have been hard to switch to a conceptually very different approach to forward guidance, such as a nominal GDP target path, at that late date as well. So they were left with little alternative but to revert to a much vaguer way of talking about policy intentions.

It doesn’t seem to me that this vaguer approach to communication was really forced by the complexity of the situation that had arisen. Of course, the situation is complex, but it had not become a lot more complex than it had been a year and a half earlier. I think it’s more that the choice of the threshold formulation in 2012 then made it hard to adopt a better approach when we reached 2014.

OTHER POLICY TOOLS

Region: It’s clear that the Fed is tapering and is beginning to experiment with other policy tools beyond quantitative easing and the fed funds rate, mechanisms such as reverse repurchase agreements. What are your thoughts about the potential effectiveness of such tools and the feasibility of implementing them?

Woodford: I am not worried that the Fed is not going to have effective tools for implementing its interest rate policies. We have yet to reach the point where they do want to raise interest rates, but assuming that things evolve as everyone is currently anticipating, we are likely to reach it within the coming year. At that point, I think, there will be tools that allow them to do it.

It will be an interesting experiment in monetary economics because the Fed will be attempting to control short-term interest rates in a situation where almost certainly its balance sheet is going to be unusually large. That means that there are going to be extraordinary quantities of excess reserves in existence, and this means that Fed control of short-term interest rates will not be achievable in the way that it always was in the past: through rationing the supply of reserves. The Fed would maintain a fairly small supply of reserves, small enough that there was indeed an opportunity cost of reserves, and it could adjust that opportunity cost fairly precisely through relatively small changes in the supply of reserves.

That won’t be the case when we begin tightening policy this time, but I think there are other tools that should be effective. And as you pointed out, they’ve been actively experimenting with the development of additional tools, just to make sure that there are enough ways to control money market interest rates.

Region: Are there any mechanisms that you think are particularly potent?

Woodford: Well, I think the fact that interest rates can be and are currently being paid on excess reserves is very important. Of course, the Fed asked for that authority from Congress back in 2008 before embarking on the large expansion in the size of its balance sheet. The reason, I think, is that it was preparing for this question that we are going to face within the next year or so: When you have this big balance sheet, have you given up control over short-term interest rates? The FOMC wanted to be sure the answer to that question was “no,” and it could do that by having the ability to pay whatever interest rate it deemed appropriate on those reserves. So that’s a very important tool, and probably the most important tool that they are going to have when the moment arises.

But you mentioned the introduction of the reverse repo facility, and I think it should also be very useful to have that tool as well. In particular, that should help to address a worry that some people have, who point out that we’re paying 25 basis points of interest on reserves right now, without this placing a floor on the federal funds rate or overnight rates in general. You then might conclude that paying interest on reserves isn’t an effective way of controlling other short-term interest rates.

My view is that it’s hard for those other interest rates to trade too far below the interest rate being paid on reserves. So I think you should be able to pull them up by increasing the interest on reserves. But if you’re worried that you could raise the interest rate on reserves substantially and it wouldn’t pull up those other money market interest rates, then having the reverse repo facility to also push them up, by offering the opportunity to get a certain overnight interest rate through transactions with the Fed, is something that ought to allay that concern.

Region: So the reverse repo facility is a backstop, in your mind, a secondary mechanism that should provide some assurance to markets?

Woodford: My guess is that even without that they would have a pretty good degree of control over overnight interest rates. But I think having the reverse repo facility makes it even more certain that if they want to raise the level of overnight interest rates by, say, 50 basis points or a percentage point, they can do that, and should even be able to do it with a fair amount of precision.

I think there was more reason to worry about whether the Fed had enough tools with which to influence financial conditions when the problem was finding more ways to loosen conditions. Once the problem becomes one of finding ways to tighten financial conditions, we’ll be facing a more familiar problem, and I think there will be ways to do it.

CONGRESSIONAL MANDATES

Region: Let me shift considerably and talk about congressional mandates.

Since 1977, the Fed has had the dual mandate—to promote price stability and maximum employment. But since the financial crisis if not before, there has been ongoing discussion primarily about whether to jettison the employment part of the mandate, so that the Fed’s focus would be strictly on maintaining price stability.

More recently, others—such as former Fed Vice Chair Donald Kohn—have suggested adding a third mandate regarding financial stability. Earlier this month, at the National Bureau of Economic Research Summer Institute, the Fed’s Vice Chair Stanley Fischer said that Kohn’s proposal “clearly warrants serious examination.”

What are your thoughts? Should maximum employment be removed from the Fed’s mandate, and would adding financial stability to the mandate be valuable?

Woodford: I’m very surprised by the proposal to eliminate the real economy side of the dual mandate. You could argue that the particular language, “maximum employment,” may not be the most precise description of the objective. But the idea that you would simply have a price stability mandate and no reference to the real economy at all, I find surprising, particularly after the experience of the past five years. Clearly, the overriding concern of policy over this period has been the state of the real economy and, indeed, the labor market, rather than inflation; in my view, that concern with the real economy has been more justified on this occasion than in many decades; and the Fed hasn’t had to sacrifice price stability in order to help support the real economy. That anyone would choose at this particular moment to propose that it would be better to force the Fed to focus solely on inflation boggles the mind.

In fact, I think that if the Fed’s legislative mandate excluded any concern for the labor market or economic activity, that would have been a straitjacket that would have been pretty unfortunate in the situation that we were just in.

Region: And financial stability?

Woodford: The question whether there should also be a financial stability mandate is a more reasonable one to take up. Though I have to say that I find it a little surprising that people would think that there isn’t one. It’s true that the Federal Reserve Act mentions price stability, it mentions maximum employment and it doesn’t, in a similarly direct way, talk about the responsibility for financial stability.

But, historically, if we ask why the Federal Reserve Act was passed at all, we know that Congress established the Fed in response to a financial crisis. From the legislative history, it’s clear that the whole point of the Federal Reserve Act was to have an institution that would act to ensure financial stability.

It’s true that when the current language of the Federal Reserve Act was drafted in the 1970s, financial stability had become a less central concern, and instead inflation and unemployment were both big problems. Still, the idea that anyone would have thought that it was somehow not the Fed’s concern is strange. I find it hard to imagine that if the Fed thinks it should do something out of a concern for financial stability, anyone would actually be able to object that this was overstepping the bounds of what Congress ever wanted it to be concerned with.

Region: But, of course, many people aren’t familiar with the Fed’s history. Adding financial stability to the mandate would make that responsibility—perhaps assumed by many—more explicit.

Woodford: That’s right, and I don’t see anything wrong with making it more explicit. It’s just that it seems to me that an amendment of the act to do this would be fixing something that isn’t really a problem.

There are, of course, important questions about the extent to which financial stability considerations should be taken into account in making monetary policy decisions, particularly when one is not already in the midst of, or on the cusp of, a serious financial crisis. But these are prudential questions—do you really know how to do it, and how might it interfere with your other goals to even try?—rather than questions about the legitimacy of the concern.

STRUCTURE AND COMMUNICATION

Region: Let me ask about the Fed’s structure, which again was set years ago. You’ve always been a powerful advocate for clarity, communication and transparency with the public—that that’s really essential to the effectiveness of Fed policy. The FOMC has become more transparent over the past 20 years, but the structure has not changed dramatically.

Do you think that the structure—with both regional presidents and the central board—tends to strengthen or obscure policy clarity and communication? Put otherwise, what are the trade-offs of a structure that has geographic representation that provides valuable input from around the country, but also may lead to policy confusion because many Fed presidents are giving speeches and making statements, versus the Fed speaking with just one voice, presumably the Chair’s?

Woodford: I think that it does definitely create problems for the transparency and clarity of communications about policy to have the kind of decentralized structure that the Federal Reserve System has. That doesn’t mean that there aren’t also advantages to it.

The obvious advantage—the reason for setting it up that way—was to have different parts of the country be represented, particularly in light of the fact that different sectors and industries are important in different parts of the economy. And I think that’s obviously valuable.

I think you could also argue that a decentralized structure is good from the point of view of having checks and balances, in the sense that “groupthink” is more easily avoided. You have independent staffs producing their own independent analyses, and then you can confront them with each other. There are advantages in having different points of view contend and seeing who ends up winning the argument. That’s another thing that’s valuable about the decentralized structure. The issue that we spoke about earlier—the FOMC’s adoption of a new, more state-contingent approach to forward guidance—had a lot to do with advocacy from some of the regional Bank presidents, including your own Bank’s, and I would call that an example of the decentralized structure working well.

But there is at least one important problem that the decentralized structure creates: Speaking clearly with one voice is a lot harder. It’s not just that a single decision maker would allow the institution to have just one voice. Even if it was a committee, if its members were all together in Washington, I think it would be a lot easier to hash things out and come to agree on what the committee has chosen as its position. And then even if multiple people were to give speeches on different occasions, I think it would be easier for them all to be conveying the same message.

When the Fed’s regional Bank presidents are in different parts of the country most of the time and only meet very briefly, it’s probably harder to have the kind of extensive ongoing discussion that would be needed to really get on the same page. And I think that that is a problem.

It wouldn’t be so much of a problem if you thought that the only decision the FOMC has to make is setting a number for the federal funds rate or something like that. And that once that number is decided, everyone can say, “OK, now that we’ve decided on the funds rate, the meeting is over, we go home and it’s going to be implemented.” If you thought that’s all there was to policy, then it wouldn’t really matter that people might have different points of view on why exactly they did or didn’t move the rate more in a given meeting; and so brief, infrequent meetings might well be enough—enough to compromise on a number that in any event only applies until the next meeting, even if it is not enough to come to a common view about the strategy behind the decisions.

But it’s clear that as a practical issue, it’s becoming increasingly important what the institution communicates to the public about where policy is heading, what the thinking is behind that policy and what the criteria are that are likely to be shaping future decisions. It’s much harder to communicate a clear view on those kinds of things without the members of the committee having an opportunity to talk to each other at more length than I’m afraid it’s easy for them to do in the existing geographically decentralized structure.

And so I think there is a problem. Whether this means that the actual structure of the Federal Reserve—or who has the voting rights—really needs to be changed, or whether they can simply organize the decision process so that the different parts of the System communicate more with each other, I’m not sure. But I would urge that it ought to be recognized as an important problem for the current organization of the system. More thought should be given to ways to increase the extent to which there is a robust exchange of views about how to best think about what the policy framework is and how it should be communicated to the outside world.

DECISION MAKING

Region: It hasn’t been your primary agenda over the years, perhaps, but you’ve devoted significant effort to understanding how humans make decisions, incorporating insights from behavioral scientists like Daniel Kahneman and Amos Tversky and neuroscientists such as Paul Glimcher.

In a paper delivered at the 2014 American Economic Association meeting in January, you suggest that cognitive limits have a fundamental role in shaping how we humans make economic decisions. And the model you describe—which hinges on constraints on the information-processing capacity of neural pathways—does a better job of fitting experimental data than did certain competing models.

What implications does that have for micro- and macroeconomic research? And how does this work fit in with your primary research focus on monetary policy?

Woodford: Well, it’s something that I have come to pursue deeply, and at least originally this was because of my interest in understanding the foundations of macroeconomics.

Region: These are the true micro foundations.

Woodford: That’s right. And the reason they are needed is because a key issue for macroeconomics, and in particular for understanding why monetary policy matters, is to understand why adjustments to changing market conditions don’t occur more smoothly and more immediately.

It’s been a very long-standing observation by economic theorists that in principle the level of wages and prices in terms of a monetary unit shouldn’t have any effect on the real economy. It ought to be only the relative prices of things that affect supply and demand decisions, and so changes in the value of the monetary unit shouldn’t in principle have to have any effect on the real pattern of transactions in the economy.

Region: But, of course, they do.

Woodford: Yes. It seems that they do! So that’s a central question for macroeconomics, and particularly for understanding why monetary policy matters. In what one can probably call the mainstream approach to this question—and certainly the one that I’ve used in a lot of my own modeling—the way that we try to think about that is by supposing that, for some reason, decisions aren’t being constantly made, and so prices are not constantly being reoptimized. Then one can look at the consequences for equilibrium and how it adjusts over time under that assumption.

A lot of my work has been trying to develop general equilibrium frameworks in which prices that are not being constantly adjusted are incorporated into the model. Then you can get real effects of monetary policy in those models and understand how equilibrium should be different with different types of policy rules.

But a central question for that kind of modeling is: Why exactly is there not more immediate adjustment of wages and prices when market conditions change? Moreover, there are reasons to worry that the answer to this question about the underlying source of the adjustment delays might actually be important for the conclusions that you get out of the model.

The mainstream approach of the literature of the last few decades has assumed that for some reason—often not too explicitly modeled—decisions about, say, wages or prices are not going to be constantly adjusted. But the models are still set up on the assumption that all of the decision makers are perfectly aware of what market conditions are and what would be currently optimal for them at every point in time, even though for some reason it would be costly to adjust, say, their prices or their wages more frequently.

We understand a fair amount about the logic of models like that. In some ways, they were only a small step away from the kinds of intertemporal equilibrium models that we understood how to work with already, and so I think that’s why we explored that path first. We did something simple that was not too different from the models we already understood well, so we could understand what we were doing.

But there are important reasons to be worried about whether the model has gotten everything right. I think it’s right to suppose that things aren’t constantly readjusted optimally, but it may be wrong to think that people are perfectly aware at every moment of what it would be in their best interest to do, if only they were not subject to “menu costs” or some other barrier to more constant adjustment. Rather, the failure to adjust probably has to do with failures of knowledge to be quite that precise in quite such a timely way. If you ask what the costs of more frequent price adjustment really are, I suspect they are costs of having to pay closer attention and make more precise decisions all the time about what exactly it’s best to be doing.

Now it is possible that if one understood the nature of those constraints better, it would turn out that everything happens just as if everyone had perfectly precise awareness, but something was constraining them from moving certain variables more often, while otherwise everything happens as if they’re perfectly aware of what they’re doing. But this may not turn out to be the case. Understanding what the cognitive limitations are, and how they are responsible for adjustments not occurring as rapidly, may have important consequences for understanding the nature of those adjustment processes and, in particular, for understanding how policy shapes and influences those adjustment processes and interacts with them.

DIFFERENT MODELS

Region: Earlier, when talking about the structure of the FOMC, you mentioned that one of the advantages of a decentralized structure is letting different theories compete and seeing which wins the day.

The “New Keynesian model,” of which you’re a primary architect, is now an essential ingredient in policy research, analysis and policymaking at central banks all over the world. That must give you a true measure of pride and satisfaction.

But on the other hand, do you think there’s a risk that it could be “crowding out” other potentially useful paradigms? Are there other, competing models that hold promise?

Woodford: Well, I certainly don’t think that our understanding in macroeconomics is already complete, so I would never argue that because we have a good theory we should therefore not try to do any further research, or even explore fairly different approaches. As I said, my own work is indeed still pushing on asking whether the foundations can be further improved, without presuming whether an improved model will necessarily deliver something similar to our current understanding, or whether it might result in more far-reaching changes.

It’s true that many are using New Keynesian models, and I think they are helpful for understanding some issues. Indeed, some of the central issues raised in the monetary policy debates of the past few years, such as the potential advantages of different forms of guidance about future interest rate policy, would have been difficult to think systematically about without these models—not because they are perfect models, but because what we had before was even less suitable for that purpose. But that’s not a claim that some other idea, yet to be developed, won’t turn out to add something important to the New Keynesian framework, or even show that one could dispense with important parts of it because there’s actually a better approach.

Region: Are there other potentially promising models? And is there a tendency in the way that economic research is carried out, in central banks and academia, that those ideas are being crowded out or dismissed, essentially discredited before they’re given due consideration?

Woodford: Well, I hope that it’s not true that promising ideas are discredited too soon. And I have to say I have trouble imagining how anyone could think of the New Keynesian framework as somehow having become hegemonic—maybe in central banks it looks like everyone uses that framework, but in academia and in the research journals, the opposite is true; not only can alternative approaches be explored, but they are most of what gets taught and published.

As to whether it has become a dominant approach among central bank research staff, I would hardly say that it’s the only thing that you see. But it’s probably true that in central banks, there is something useful about having a systematic framework that you can use to think about a whole series of problems that are going to come up, using a coherent framework and language that aren’t rethought from the beginning with every new issue that arises.

The job of central bank staff is to respond to rapidly changing situations in the economy and to be able to brief policymakers on how to think about what’s happening. So the balance between time spent using an accepted framework to address applied questions and time spent on more fundamental inquiries into whether a better framework might be possible is somewhat different than in academia.

Also, as we were saying earlier, central banks need to be able to communicate, and communicate in a relatively clear way, about what their approach to what they do is, and from that point of view there will be an advantage to adopting a particular framework, simply in order to be able to send a clear message. That doesn’t mean you should never consider changing it, but if you’re constantly talking about how everything is up in the air and you’re considering many different possible approaches, that makes it harder for people outside to guess what you might or might not be doing next year.

But even in central banks, I don’t think it’s true that there is no room for alternative approaches, even fairly speculative ones, to be pursued. The development of my own ideas about the foundations of monetary economics would not likely have taken the direction they did without many fruitful discussions with researchers in central banks, at a time when the work was far from the academic mainstream. This included not only discussions with people in the Federal Reserve System, but some very crucial discussions with people at banks like the Bank of Canada, the Reserve Bank of New Zealand and the central banks of Sweden and Norway.

POLICYMAKER?

Region: That leads to my last question, about whether you’ve considered a more formal policy position. You’ve been a valued adviser to many central banks, but as far as I’m aware, you’ve never had an official policymaking role.

Woodford: That’s right.

Region: Have you ever considered it? Do you ever think that your research might have greater impact if you were in a position to implement it? Certainly there have been—are currently, for that matter—prominent central bankers who come from academia. Perhaps this relates to a broader question: Why is monetary economics so elegant in theory, but difficult in practice?

Woodford: Well, I don’t think it’s a mystery why it’s difficult in practice. The world is complicated. I find it more surprising that there is as much use for theoretical clarification as there is, especially compared to a lot of other areas of practical activity. Most practical problems are complicated, and they’re dealt with by practical people who use some ideas about what they’re doing, but don’t use much high-powered theory.

Central banking is instead one of the human activities where I think there is some real use to relatively abstract theoretical contributions. That’s because while it is certainly still a complicated issue, it also raises conceptual issues that are of such a nature that it is actually helpful to have some theory at your disposal in thinking about how to deal with them.

That’s partly because of the abstract nature of the problem, and it’s partly because of the thing we’ve already been talking about, which is that not just making a judgment but communicating about how you make it also matters. That second reason means that even if gut instincts obtained from experience that haven’t been conceptualized theoretically could get you to make the right guesses about what to do at a given moment, that really wouldn’t be enough to completely solve the problem of the modern central banker, because I think they now also need to be able to talk about what their institutions are doing. That means that there is a role for people who can conceptualize the problem, if only to be able to improve that communication process.

Region: Given that, have you always viewed your role as a theoretician, rather than policymaker, and if so why?

Woodford: That’s a question of the division of labor in a complex society, and part of why a society like ours is as productive as it is, is because many people play different roles, and we try to get them slotted into the ones that they’re better at.

In smaller, less complicated societies, anyone who is an economist probably should be involved in government and teaching at the same time, because there aren’t as many people to assign to different roles. But in a place like the United States, there are many people to play many different roles.

And my guess is that I’m better at thinking about more long-range issues and underlying conceptual problems, and that other people are better at thinking about what to decide this month and how to give the press conference this week, and those kinds of things, that require you to be on the ball and adjust quickly to short-range changes in situations. Different people are good at different roles.

MORE ABOUT MICHAEL WOODFORD

Current Position

John Bates Clark Professor of Political Economy, Columbia University, since 2004

Previous Positions

Harold H. Helm ’20 Professor of Economics and Banking, Princeton University, 1998-2004; Professor of Economics, 1995-98

Professor, Department of Economics, University of Chicago, 1992-95; Associate Professor, 1989-92; Assistant Professor of Business Economics, Graduate School of Business, 1986-89

Assistant Professor, Department of Economics, Columbia University, 1984-86

Professional Affiliations

Scientific Adviser, Sveriges Riksbank, since 2012

Economic Advisory Panel, Federal Reserve Bank of New York, since 2009; Monetary Policy Panel, since 2004

Research Fellow, Program in International Macroeconomics, Centre for Economic Policy Research, since 2004

Research Associate, Programs in Economic Fluctuations and Growth and in Monetary Economics, National Bureau of Economic Research, since 1994

Series Co-Editor, Handbooks of Economics, Elsevier Press, since 2013

Node Leader, International Network on Expectational Coordination, since 2012

Editorial Board, Annual Review of Economics, since 2010; American Economic Journal: Macroeconomics, since 2007

Advisory Board, International Journal of Central Banking, since 2008

Frank P. Ramsey Prize Committee, Macroeconomic Dynamics, since 2000; Advisory Editor, since 1996

Honors and Awards

Fellow, American Academy of Arts and Sciences, since 2004

Fellow, Econometric Society, since 1991

Named one of Bloomberg Markets’ 50 Most Influential Thinkers, 2013

Deutsche Bank Prize in Financial Economics, 2007

Association of American Publishers 2003 Award for Best Professional/Scholarly Book in Economics, for Interest and Prices: Foundations of a Theory of Monetary Policy, Princeton University Press

John Simon Guggenheim Memorial Foundation Fellowship, 1998-99

John D. and Catherine T. MacArthur Foundation Prize Fellowship, 1981-86

Publications

Author, Interest and Prices: Foundations of a Theory of Monetary Policy, 2003; editor (with Benjamin M. Friedman), Handbook of Monetary Economics, vols. 3A-3B, 2011; editor (with Ben Bernanke), The Inflation Targeting Debate, 2005; editor (with John B. Taylor), Handbook of Macroeconomics, vols. 1A-1C, 1999; prolific author of research articles on macroeconomic theory and monetary policy

Education

Massachusetts Institute of Technology, Ph.D., economics, 1983

Yale Law School, J.D., 1980

University of Chicago, A.B., 1977

For further background, visit Woodford's web page.
