
Interview with Lars Peter Hansen

Nobel laureate on the importance of uncertainty, “systemic risk” as a catch phrase, and links between asset prices and the macroeconomy

December 17, 2015

By Douglas Clement, Managing Editor (former)
Lars Peter Hansen (Photo by Peter Tenzer)

In March 1979, Lars Peter Hansen, a young Ph.D. fresh out of the University of Minnesota, submitted a paper to the prestigious Econometrica. It described a statistical methodology that, in its final form, would allow economists to draw strong conclusions from models that weren’t completely specified (that is, not all variables, relationships or assumptions were included or precisely defined).

This “generalized method of moments” would give econometricians the ability to appraise alternative theories and investigate important economic phenomena without fully developing each of their elements. Researchers could rely on the most powerful explanatory variables and dispense with unnecessary assumptions. “GMM allows you to ‘do something without having to do everything simultaneously,’” Hansen explains.

But the GMM—abstract and mathematically challenging—was not immediately embraced by the field. (Indeed, Hansen’s initial draft was rejected by Econometrica, spurring him to refine and generalize his argument.) Hansen and his colleagues persevered, demonstrating the methodology’s power and range by applying it to exchange rates, asset pricing models and rational expectations theory. These and other examples gradually convinced economists of its utility and, with time, GMM became the gold standard. In 2013, Hansen received the Nobel Prize in economic sciences for his methodology, specifically in reference to its ability to evaluate asset pricing models.

Hansen continues to study asset prices, focusing on linkages between financial markets and the broader macroeconomy. Recent work looks at uncertainty and risk tolerance in asset pricing behavior; he has also developed methods to analyze and account for the uncertainty of the households and businesses that populate economic models, as well as the uncertainty that econometricians have about the adequacy of their models. Related research examines policymaking under uncertainty.

At the top of his field, he has received nearly every major award, but he’s especially pleased with his role in developing future generations of gifted economists. He is a major force behind the Becker Friedman Institute at the University of Chicago, a multidisciplinary research center that supports young scholars and others in economics, law, public policy and business. And he devotes a full page of his CV to a list of the Ph.D. students he’s advised.

“I’m proud of my research accomplishments,” he observes in this Region conversation, “but I am also proud that I am able to associate with so many very good graduate students.”

Interview conducted November 4, 2015

The generalized method of moments and its application

Region: Let’s start, if we could, with asset prices and the generalized method of moments, known as GMM. When you were honored with the Nobel award in 2013 for your work on the GMM, the committee said that the GMM is “particularly well suited to testing rational theories of asset prices.” Could you give a brief description of the method and its importance to later work?

Hansen: Sure. I like to think about it as follows: Economists can build a full-scale model of a macroeconomy with many different equations, where they map out a really rich structure of financial markets and the macroeconomy. The question is: Is there a way that you can study the connections between asset prices and macroeconomic outcomes without necessarily getting all the other details exactly correct? For instance, an econometrician may wish to avoid specifying the precise information used by investors or detailing all of the ingredients that govern the evolution of the macroeconomy. My aim was to develop methods that allow for initial investigation of linkages without requiring a fully fleshed-out model.


Region: Which, of course, is always a challenge at the initial stages of model development.

Hansen: Right. The point is not even to make a pretense of believing that everything is fully specified. The econometrician’s jargon of a “partially specified model” applies where one tries to study linkages without having to spell out all the various different details. As you build models to use for policy purposes, you do have to specify many of those details. But as an initial step, it’s nice to be able to make assessments without all that detail. I like to say that the GMM allows you to “do something without having to do everything simultaneously.”
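
To make the moment-condition idea concrete, here is a minimal sketch written for this page rather than drawn from Hansen’s own work: the mean and variance of a series are estimated from a few moment conditions alone, with the rest of the distribution left unspecified. The data-generating process and the identity weighting matrix are illustrative assumptions.

```python
# A minimal GMM sketch (illustrative, not Hansen's code): estimate a mean and
# variance from moment conditions, leaving the full distribution unspecified.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
x = rng.standard_t(df=8, size=5_000)     # some symmetric series; "true" mean 0

def moment_averages(theta, x):
    """Sample averages of the moment conditions; near zero at good estimates."""
    mu, sigma2 = theta
    g1 = x - mu                          # E[x - mu] = 0
    g2 = (x - mu) ** 2 - sigma2          # E[(x - mu)^2 - sigma^2] = 0
    g3 = (x - mu) ** 3                   # E[(x - mu)^3] = 0: overidentifying,
    return np.array([g1.mean(), g2.mean(), g3.mean()])  # assumes symmetry only

def gmm_objective(theta, x, W):
    g = moment_averages(theta, x)
    return g @ W @ g                     # quadratic form in the averaged moments

W = np.eye(3)                            # first-step identity weighting matrix
fit = minimize(gmm_objective, x0=[0.0, 1.0], args=(x, W), method="Nelder-Mead")
print("GMM estimates (mu, sigma^2):", fit.x)
```

With more moment conditions than parameters, the quadratic form cannot in general be driven all the way to zero, and its minimized value is the basis for the overidentification tests associated with GMM.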

There’s a long history in formal econometrics behind this approach. Some of my research had an important antecedent, namely, the work of Denis Sargan back in the late 1950s. I was also very heavily influenced by lectures Chris Sims gave back when I was a graduate student. While I was developing a formal justification for the econometric method, what excited me and helped my research gain some traction were the empirical applications that I and others came up with.

Is there a way that you can study the connections between asset prices and macroeconomic outcomes without necessarily getting all the other details exactly correct? My aim was to develop methods that allow for initial investigation of linkages without requiring a fully fleshed-out model.

Region: What do you think was achieved by applying these methods, in your own work and in work by others?

Hansen: My initial applications included analyses of forward exchange markets with Bob Hodrick and an investigation of the pricing of a cross section of asset returns with Ken Singleton. The Hodrick paper documented empirical challenges that have altered how researchers model exchange rate determination. The Singleton research exposed gaps in the existing macroeconomic models in terms of their implications for asset pricing. This work in turn encouraged Ravi Jagannathan, John Cochrane and me to characterize asset-pricing puzzles in more general terms. Scott Richard and I began to explore more abstract economic formulations of so-called stochastic discount factor models, where such factors simultaneously discount the future and adjust for risk. The empirical challenge to model builders is to understand why the implied risk prices are large and fluctuate over time in interesting ways. It’s been rewarding to observe the subsequent model extensions and refinements motivated by empirical challenges.
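
For readers unfamiliar with the construct, a standard textbook statement of the stochastic discount factor restriction (a gloss added here, not a quotation from the interview) is

$$ E\left[\, m_{t+1}\, R_{t+1} \,\middle|\, \mathcal{I}_t \right] = 1, $$

where $m_{t+1}$ discounts the future and adjusts for risk, and $R_{t+1}$ is any gross asset return. The Hansen-Jagannathan line of work mentioned above showed that observed returns imply bounds such as

$$ \frac{\bigl|E[R] - R^f\bigr|}{\sigma(R)} \;\le\; \frac{\sigma(m)}{E[m]}, $$

so that large, time-varying risk prices in the data require a volatile discount factor in any candidate model.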

Asset prices and the macroeconomy

Region: Asset prices are known to be forward-looking and, as such, they contain information about private sector beliefs. How does your work bear on this topic?

Hansen: Researchers have indeed focused on the forward-looking nature of asset prices. Prices today depend on what people think is going to happen in the future. For instance, I may have observations on the market values of some type of underlying dividend or cash flow process. Or I may make assessments of the market value of education, depending in part on guesses as to what the salary will be for different levels of education going forward.

Asset prices, broadly conceived, reflect people’s beliefs about the future, but they also reflect how people respond to uncertainty. What makes people cautious and what makes them bold? From this vantage point, the challenge of an empirical asset-pricing model is to disentangle the contributions coming from beliefs versus those coming from concerns about uncertainty. And I’ve been keenly interested in methods that allow us to try to disentangle these two different impacts.

This aim to extract information from forward-looking prices is not just an academic exercise; it’s on the radar screen of policymakers as well. For instance, those engaged in monetary policy look to financial markets to try to see where the private sector thinks the macroeconomy is headed. So it’s very interesting to ask what types of restrictions we impose on models that allow us to disentangle the contributions of private sector investor beliefs from adjustments made for the uncertainties that they perceive. And this has been a long-standing interest of mine.

The challenge of an empirical asset-pricing model is to disentangle the contributions coming from beliefs versus those coming from concerns about uncertainty. And I’ve been keenly interested in methods that allow us to try to disentangle these two different impacts. This aim to extract information from forward-looking prices is not just an academic exercise; it’s on the radar screen of policymakers as well.

Region: What methodological approaches have you recently developed to further the understanding of linkages between asset pricing and the macroeconomy?

Hansen: The empirical finance literature has focused on risk-return trade-offs over a quarter or a month, or sometimes even shorter horizons. But pricing implications extend over much longer horizons, and that’s the part that has been less well-characterized, although it’s important for understanding investment decisions and has macroeconomic implications.

I’ve been working with collaborators, including José Scheinkman and Jarda Borovička, to expand the existing tool kit for analyzing this problem. We take our inspiration from two sources. The first is the literature on impulse-response functions. This methodology was introduced to economists by Ragnar Frisch and has been used extensively in empirical macroeconomics. It aims to quantify the different sources of economic fluctuations. Formally, it measures how important alternative macroeconomic shocks are in driving asset prices over different horizons. The shocks themselves hit the economy the next quarter, two quarters down the road, three quarters down the road and so forth. Then you trace through the dynamic responses of economic variables to the alternative macroeconomic shocks over different horizons. These impulses can build over time and reinforce each other, or they can diminish over time. There’s been substantial empirical work that characterizes those dynamic patterns.
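
As a toy illustration of the impulse-response logic (the process and its persistence parameter are invented for this page, not taken from the models discussed here), consider a scalar AR(1):

```python
# Toy impulse-response function for an AR(1): x_t = rho * x_{t-1} + shock_t.
import numpy as np

rho = 0.9                           # quarterly persistence, illustrative only
horizons = np.arange(21)            # 0 to 20 quarters after a unit shock
irf = rho ** horizons               # dynamic response of x at each horizon

for h in (0, 1, 4, 8, 20):
    print(f"quarter {h:2d}: response {irf[h]:.3f}")
# Here a unit shock decays geometrically; in multivariate (VAR) settings,
# feedback across variables can produce responses that build before they fade.
```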

The second source of inspiration is from an asset-pricing perspective, where we explore the returns investors must receive as compensation for their exposure to these macroeconomic shocks. Valuation of the cash flows must account for their exposure to macroeconomic shocks. Much like impulse-response functions, decompositions of these compensations by the investment horizon are revealing. Suppose the shock happening in the immediate future affects the cash flow the next quarter, two quarters down the road and subsequent quarters—from an asset-pricing perspective, we measure the implied market-based compensations for those exposures. This gives us a richer understanding of alternative models and a richer perspective from which we can look at linkages between financial markets and the macroeconomy.
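
One simple way to formalize such a horizon decomposition (a sketch in the spirit of, though far simpler than, the constructs Hansen develops with Scheinkman and Borovička) prices a cash flow $G_t$ paid at date $t$ with a $t$-period stochastic discount factor $S_t$. The per-period risk compensation at horizon $t$ is the expected log return on the cash flow minus its riskless counterpart:

$$ \pi(t) \;=\; \frac{1}{t}\log\frac{E[G_t]}{E[S_t G_t]} \;-\; \frac{1}{t}\log\frac{1}{E[S_t]} . $$

Tracing $\pi(t)$ as the horizon $t$ varies shows whether the market compensation for a given exposure builds, flattens or decays with the investment horizon.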

The resulting quantitative methods allow us to measure the impact of uncertainty as it compounds over time. We have a well-known notion of what happens when we compound interest; seemingly small things can grow into very, very big things. Something similar occurs in the case of uncertainty. As uncertainty compounds over time, random outcomes that don’t originally look to have big consequences can grow in magnitude as they play out. This research provides a way to quantify such phenomena—to actually measure this compounding uncertainty and see how it might be reflected in the behavior of asset prices.
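
A back-of-the-envelope simulation (with hypothetical numbers, added here for illustration) makes the compounding point: quarterly growth shocks that look negligible one at a time open up a wide range of long-horizon outcomes.

```python
# Compounding uncertainty: small quarterly log-growth shocks accumulate into
# large dispersion at long horizons. All parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.01                                   # 1% std dev per quarter
paths = rng.normal(0.0, sigma, size=(100_000, 40)).cumsum(axis=1)

for h in (1, 4, 40):                           # one quarter, one year, ten years
    growth = np.exp(paths[:, h - 1])           # cumulative growth factors
    lo, hi = np.quantile(growth, [0.05, 0.95])
    print(f"{h:2d} quarters: 5th-95th percentile of growth factor "
          f"{lo:.3f} to {hi:.3f}")
```

In this illustrative parameterization, the 5th-to-95th percentile band widens from roughly plus or minus 1.6 percent at one quarter to about plus or minus 11 percent at 10 years, even though no single quarter looks dramatic.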

Just as growth rate uncertainties compound over time, so do the market compensations for those uncertainties. Alternative economic models have interesting things to say about how compensations differ across alternative investment horizons.

Uncertainty and market returns

Region: To measure the impact of uncertainty and understand how it compounds over time, how do you capture the link between uncertainty and financial market returns?

Hansen: Risk premia, which are measured compensations in financial markets, can be large for one of two reasons. One is that exposures to risk are larger, and the other is that the prices of those exposures, the market-based compensations, are larger. I think about the former as a quantity effect, the latter as a price effect. This dichotomy isolates the measurement challenges.
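
In textbook notation (an illustration added here, not Hansen’s own formalism), the dichotomy is the familiar factorization of a risk premium into an exposure and a price:

$$ E[R] - R^f \;\approx\; \underbrace{\beta}_{\text{quantity: exposure to the shock}} \;\times\; \underbrace{\lambda}_{\text{price: compensation per unit of exposure}} . $$

A large measured premium can reflect a large exposure, a large price of exposure, or both, and the two components pose distinct measurement problems.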

From a quantity perspective, we ask, how exposed are the economically relevant cash flows to uncertainty? How do we measure the amount of uncertainty out there in the underlying economic environment? How much uncertainty is there about specific economic outcomes, or about the macroeconomy in general? This is the quantity channel.

From a price perspective, we ask, how do decision makers or market participants react to this uncertainty? What are our attitudes about uncertainty, and how much aversion do we have to it? These attitudes show up in how markets compensate people when they are exposed to different types of uncertainty. All the time, we’re talking about financial markets being cautious or bold and risk premia being high or low. Given such observations, the measurement challenges are twofold. One is, how much uncertainty is there that we’re exposed to? And the other is, how do people react or respond to that uncertainty?

Region: What are some of the motivations behind this research?


Hansen: I believe that understanding the mechanism connecting long-term concerns to short-term consequences for financial market compensations is a very important endeavor. Amir Yaron, a former student of mine, in some intriguing collaborations with Ravi Bansal, explored the consequences of what they referred to as “long-run risk” for asset pricing. They considered alternative models of risk preferences for investors suggested in previous research and hit upon situations in which investors’ perceptions about long-term uncertainty—in, say, macroeconomic growth—had implications for even short-term pricing of securities. By short-term pricing here, I mean the short-term measures of the so-called risk-return trade-off—a common target for empirical work in finance. Thus, they described interesting linkages between long-term uncertainty and short-term implications. Their work has been challenged empirically in part because quantifying long-term uncertainty is difficult. Nevertheless, I was very much intrigued by their finding of the potential for this linkage to be important. This is one of the reasons I have been working on novel characterizations of asset-pricing models that look across different time horizons and what happens as we change the horizons.

I’ve been very interested in developing tools to think about this problem and to explore models in which there can be long-term uncertainty, but approached from a more eclectic perspective on uncertainty. My aim is to circumvent some of the criticisms of the work of Bansal and Yaron, while still getting at similar quantitative impacts.

The importance of uncertainty

Region: Some of your more recent work focuses on the effects of uncertainty, broadly conceived, on asset prices. What led you to see uncertainty as an important question to explore?

Hansen: When I was initially doing some of this research jointly with Ken Singleton, we used “off-the-shelf” macroeconomic models of the time. They were so-called rational expectations models, but they had other features as well. Imposition of rational expectations has been a demonstrably successful way of removing arbitrariness from specifying beliefs. As a simplifying assumption, it presumes that investors assign probabilities with complete confidence in ways that are consistent with the underlying economic model. It has been a powerful modeling tool in a variety of settings. We imposed rational expectations on the part of investors in our use of time series evidence, and we structured an econometric approach that was tractable while exploiting this restriction.

There’s lots of uncertainty about future macroeconomic growth. ... In my view, it is critical to think about both the magnitude of growth rate uncertainty and how this uncertainty affects decision-making.

We originally envisioned the resulting econometric method as providing a way to estimate some key parameters such as how risk-averse investors are or how they view intertemporal substitution in consumption. It was intended as a formal way to obtain inputs that we could start plugging into bigger models.

That wasn’t how the research played out. Instead, we and others ended up exposing the problems associated with different classes of models. As I mentioned before, my co-authors and I were led to characterize empirical puzzles and challenges that the next set of models would have to confront.

Empirical puzzles only have meaning relative to a class of models. There is now a body of evidence suggesting that there are periods when financial markets look very cautious or “risk averse,” as reflected by the implied market compensations. And the types of models we were initially looking at were just not capturing this phenomenon. This evidence led economists to think about what would lead to that type of behavior and how one can make the model richer to get a much better characterization of these linkages between financial markets and the macroeconomy.

I believe that exposing empirical challenges encouraged researchers, myself included, along with a whole variety of other people, to think hard about richer models of asset price determination and to see to what extent they could help us better understand these various empirical phenomena. I became intrigued by rethinking how we model investor responses to uncertainty.

Region: What is your own thinking on these empirical challenges?

Hansen: There have been a variety of modeling extensions that I find interesting, related to richer models of investor preferences and to alternative market imperfections. Tom Sargent and I became actively interested in how uncertainty about the macroeconomy affects both the workings of financial markets and the design of economic policy.

There’s lots of uncertainty about future macroeconomic growth. For instance, economists and others debate whether we’re currently in a period of secular stagnation or whether we’re in the process of growing our way out of it. They debate whether the technological advances of the last couple of decades were special in terms of their potency. Going forward, will technological advances proceed at a much slower pace, or will there be new advances that have a major impact on future opportunities?

In my view, it is critical to think about both the resulting magnitude of growth rate uncertainty and how this uncertainty affects decision-making. From the vantage point of the private sector, trying to figure out whether now is a good time to invest in new projects or to fund new enterprises, investors are led to speculate about how the future of the macroeconomy is going to evolve. This challenge in responding to uncertainty is reflected in the behavior of financial markets. And recently there have been some modeling advances in how decision makers respond to uncertainty and, more specifically, in what the consequences are for the compensations observed in financial markets and the extent to which this uncertainty induces more cautious behavior. Interestingly, these advances have come from different disciplines, including statistics, control theory and decision theory.

As I mentioned previously, much economic analysis targets a particular view of risk. This perspective is one in which people know the probabilities of future events, but they don’t know outcomes. I prefer to think in broader uncertainty terms, where people not only don’t know outcomes, but also struggle with what probabilities to assign to those outcomes. And when the private sector engages in these struggles, that shows up in the behavior of forward-looking securities market prices, and it shows up in the resulting investment behavior. The methods that I’ve been working on are designed to turn these kinds of qualitative descriptions into quantitatively meaningful characterizations of the importance of uncertainty.

Risk versus uncertainty

Region: Could you elaborate on this distinction between risk and uncertainty? How does it differ from the more traditional way of understanding risk and uncertainty?

Hansen: I like to draw the distinction this way. Imagine that we’re rolling dice, period after period. Rolling dice is a game of chance: we know the odds of getting different numbers, and we can figure out how those odds compound across multiple rolls. But in more complicated situations, we don’t know the probabilities themselves. Part of what we’re doing—in our role as statisticians—is using evidence to figure out the probabilities, but even to do that, we have to have some reasonable ways to think about the probabilities.

An approach that reflects a struggle, similarly faced by econometricians in building and analyzing dynamic economic models, will add a richness to how we capture the behavior of investors inside the models we build.

The more complex the underlying economic environment—the macroeconomy, for example, or alternative financial markets—the more challenging it is to make fully probabilistic assessments. So it’s important to go beyond this vision of a game of chance, where we know probabilities, into something in which the decision makers themselves are trying to make guesses about the right way to think about the uncertainty.
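
A minimal numerical sketch of the distinction (the payoffs and the candidate probability models are invented for illustration): under risk the odds are known exactly, while under broader uncertainty the decision maker entertains several probability assignments and may evaluate a prospect cautiously against the worst of them.

```python
# Risk vs. broader uncertainty for a six-sided die whose payoff is the face.
import numpy as np

payoff = np.arange(1, 7, dtype=float)     # payoff equals the face rolled

# Risk: probabilities known exactly (a fair die).
p_fair = np.full(6, 1 / 6)
print(f"known odds: expected payoff = {payoff @ p_fair:.3f}")

# Uncertainty: the true probabilities are only known to lie among candidates.
candidates = [
    np.array([0.20, 0.20, 0.15, 0.15, 0.15, 0.15]),  # tilted toward low faces
    np.full(6, 1 / 6),                               # fair
    np.array([0.15, 0.15, 0.15, 0.15, 0.20, 0.20]),  # tilted toward high faces
]
worst_case = min(payoff @ p for p in candidates)
print(f"worst case over candidate models: expected payoff = {worst_case:.3f}")
```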

Region: This relates, I think, to your Nobel lecture in which you speak at length about “inside” and “outside” uncertainty: the inside uncertainty of actors who populate economic models—households, businesses and policymakers—and the outside-the-model uncertainty of econometricians who’ve built the models, but do not know all of the parameters or aren’t certain they’re right. Could you elaborate on the distinction between inside and outside uncertainty?

Hansen: Let’s go to outside-the-model uncertainty first. If we’re given some dynamic economic model that represents our understanding of the economy (this is the perspective that a statistician would take), then the researcher is going to take that model with unknown parameters and use data to figure out what those unknown parameters are. There may be multiple models, so econometricians may have to assess whether it’s a good or bad model, or whether they may want to compare alternative models. In assessing economic models, the researcher has some uncertainty about how well particular models, or combinations of models, represent the economy, or whether those models are even correctly specified.

But when we’re building economic models, we have to take a stand on what’s going on inside the models, too. We have to depict the economic actors—the consumers, the enterprises and the policymakers—as they cope with uncertainty. We then deduce consequences for market outcomes that respect their attempts to cope with that uncertainty.

When we’re thinking about uncertainty, it’s useful to keep both of these vantage points in mind. Historically, one approach has been to assume that the actors inside the model know the risks they face, as captured formally by a probability model. But realistically, they may not know how to model the risk, and they may react differently to alternative forms of uncertainty. This suggests that an approach that reflects a struggle, similarly faced by econometricians in building and analyzing dynamic economic models, will add a richness to how we capture the behavior of investors inside the models we build.


Long-term uncertainty and short-term behavior

Region: How does this work of yours approach the challenge of understanding when long-term uncertainty affects investors in the short term? How do you understand that connection?

Hansen: Investors who are in the financial markets may well not know what’s going to happen to the macroeconomy four, five, 10 years down the road. That can affect their current period decision-making and how they allocate financial resources in both the short term and the long term. Thus, the struggle with what’s going to happen over longer time periods affects what goes on in even short-term behavior inside financial markets—how financial market compensations for uncertainty fluctuate over time.

Asset markets may behave cautiously today in part because the participants are concerned about where the macroeconomy is going to be in five and 10 years. So what does it take for the future to matter a lot for decisions I make today about short-term investments? In financial markets, you always have this possibility that you can make an investment today, but if things are liquid enough, you can undo it tomorrow. So you can afford to take a short-term perspective as well. But it can still be the case that people’s concerns about long-term behavior of the overall macroeconomy can affect even that short-term decision-making.

Region: Do you think that the average investor thinks in terms of long-term and short-term behavior of the economy?

Hansen: Once we came out of the financial crisis, I think there was lots of uncertainty about how quickly the economy was going to recover. Some people thought it might recover fast; some thought it might recover slowly, and that thinking affects what types of new investment projects people engage in. Should we hold back and be cautious now, or is now a good time to invest in new business and new enterprises, when we’re not really sure where the economy is going to be in the future, long term?

There are all sorts of possible sources of long-term uncertainty, from fiscal challenges to what’s going to happen to technological growth over long horizons or, for that matter, the economy’s long-term impact on the climate and, conversely, the impact of climate change on the economy. Of course, measuring these with any degree of accuracy is enormously challenging. What has been fascinating to me is both having the methods to think about this problem in a very systematic way and exploring what happens when we endow investors with less confidence in their abilities to make precise assessments of the nature of these long-term uncertainties. And then the aim is still to get interesting and fascinating models whereby this becomes a reason why financial market prices are high sometimes and not at other times.

Robust control theory

Region: Let me turn to a related topic, robust control theory. You and Tom Sargent pioneered its application in economics, and in your 2007 book with Tom, you wrote that there’s been a long tradition in economics of framing macro policy rules “in light of doubts about model specification.”

Robust control theory, as I understand it, is about designing rules that will work under a range of alternatively specified models because policymakers, or econometricians themselves, are uncertain about those models. Is robust control theory an effort to address those doubts about model specification?

Hansen: The term “robust control theory” comes out of control theory in engineering. In my work with Tom and in work by other economists, we find it very handy to build on insights that have come out of the control theory engineering literature. This work has been targeted toward solving practical problems, where you have to develop methods that are numerically attractive as well as revealing—that is, that we can attach meaningful interpretations to them. We’ve also found guidance coming out of decision theory within economics and out of statistical decision theory. Robust Bayesian methods, for instance, have a very similar flavor to them.

I find it interesting to draw insights from all these different literatures and try to make connections and then build on them. Economic models are different from standard engineering models, so it’s not as if we can just take their models, put in economic data and turn the crank, but they’ve had some very insightful ways to look at problems.

Robustness becomes relevant when we put multiple models on the table. We may not trust any of the given models that we have, but they are our best guesses. So how do we use models without taking them too literally or too seriously and still make them serve as useful guides? Many policymakers, once they’re outside the public realm, actually go through informal exercises similar to the following: “This is one view or model of the world. Or it might be another model. Suppose we consider a course of action; how might it work under the different models? Does one course of action seem to work pretty well across different economic models?”
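
A stylized version of that informal exercise (the models, policies and welfare numbers are all hypothetical) is the max-min comparison below: score each course of action under every model on the table and prefer the action whose worst case is least bad.

```python
# Robustness as max-min over candidate models: pick the policy with the best
# worst-case outcome. All models, policies and payoffs are hypothetical.
welfare = {
    # welfare[policy][model]; higher is better
    "aggressive": {"model_A": 5.0, "model_B": -2.0},
    "moderate":   {"model_A": 3.5, "model_B":  2.5},
    "passive":    {"model_A": 1.0, "model_B":  1.5},
}

robust_policy = max(welfare, key=lambda pol: min(welfare[pol].values()))
print("max-min choice:", robust_policy)   # "moderate": works well in both models
```

The max-min criterion avoids committing to weights across models; weighting them instead, as Hansen notes below, is the Bayesian alternative.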

This, I believe, is a reasonable alternative to presuming, “This is the model, I know it to be right and I’m going to go with the answer from this model.” Of course, we may wish to weight the alternative models, as suggested by Bayesian decision theory, but there is still a challenge as to what weights to use and how sensitive the policy prescription is to this weighting.

I worry about the pressures that policymakers face to project a lot of confidence in a precise rationale when they communicate with the public. Moreover, some economists who want to influence policymakers are led to project their own viewpoints with incredible certitude, so that the policymakers can then turn around and present these same viewpoints to the public. This outcome often overstates the certainty of our underlying knowledge base in ways that can be socially unproductive.  

The simple fact that there are uncertainties—about the exact magnitude, the timing, how a problem might unfold—doesn’t mean it isn’t prudent to act now. It just influences what things are sensible to do now. ... I think a lot of the fear of acknowledging uncertainty is fear that it will lead to inaction, and I don’t think that’s a correct assumption.

There’s a lot of concern about acknowledging uncertainty. For instance, I see this in some of the discussions of climate change. People are concerned that, once we acknowledge the scientific uncertainty, the conclusion the public will reach is, “Well, therefore, we shouldn’t do anything because we’re uncertain of the exact human influence on the environment going forward.”

The simple fact that there are uncertainties—about the exact magnitude, the timing, how a problem might unfold—doesn’t mean it isn’t prudent to act now. It just influences what things are sensible to do now. One could say, “Well, we should wait until we get more knowledge.” But if we wait until we get more knowledge, it may be very, very costly or impossible to intervene at that point in time. So I think a lot of the fear of acknowledging uncertainty is fear that it will lead to inaction, and I don’t think that’s a correct assumption. Maybe it will lead to doing some simple things now until we understand mechanisms more clearly, but it doesn’t necessarily lead to the conclusion that we shouldn’t do anything.

Region: Related to this point about the projection of certainty, your closing observation in your Nobel lecture was, “Uncertainty generally conceived is not often embraced in public discussions of economic policy. When complexity is not fully understood by policymakers, perhaps it is the simpler policies that are more prudent.” Can you elaborate on that? I found it a little provocative.

Hansen: [Laughs.] Well, this builds on long-standing discussions that economists have had in other contexts. A long time ago, in a rather different monetary policy environment than we have today, Milton Friedman argued that we need to have simple policy rules because there are long and variable lags in the transmission mechanism through which changes in monetary policy influence the macroeconomy.

I would interpret that as saying, “We don’t really understand the details of the transmission mechanism, and maybe there are dangers in pretending we know too much. Instead, maybe we ought to be doing something simple, to avoid taking too literally the complexities that a given model might have.” Once I start looking across models and recognizing potential differences or similarities across models, I might well be led to do something relatively simple that works across the models rather than embracing some particular, more complicated approach based on taking this one model literally.

So my statement was meant to get at that type of idea, but I do not have a formal statement or proof of such a proposition. Let me also add that there are simple rules that are good, and simple ones that are bad, so simplicity is only part of the story. In the statement that you quote, the reference is to simplicity being a possible outcome that I think we ought to seriously consider. It is not a statement that, in all sets of circumstances, one should necessarily do something simple.

There are fascinating links between complex environments and how much uncertainty you have. The more complex the environment, the harder it is for us to understand all those complexities, and the more we have to deal with some form of uncertainty as we think about the environment. It can be problematic when policy complexity itself adds to the underlying uncertainty in the economic environment and opens the door further to regulatory discretion. 

“Systemic risk”

Region: During development of the Dodd-Frank Act, a policy effort to prevent future financial crises, there was much discussion of “systemic risk.” In a 2013 paper, you wrote about the challenge of identifying what that is and then quantifying it as a first step in addressing it. Do you believe we have a better sense of “systemic risk” than we did previously?

Hansen: The term “systemic risk” has an interesting history. If you go back 10 years and do some Google searches for it, you won’t find much. Post-financial crisis, it’s become kind of a buzzword or grab bag, so to speak, that people use to rationalize a variety of interventions in financial markets. As we sort through the various empirical evidence connected to this financial crisis and as we revisit other financial crises, we will get a better idea of a variety of constructs connected to so-called systemic risk. But most likely, it’s going to be the case that there isn’t some single way to measure systemic risk.

To really support measurement, we’re going to have to add more specificity, think about it in different ways and then do the quantitative assessment of which of these different channels turns out to be the most important one. I think that’s where some of the quantitative ambition going forward can be very powerful.

So I’m not very optimistic that we will be able to say, “Here’s the one measure of systemic risk; let’s go with it.” But I do think that we’ve made some limited progress in thinking about where regulatory oversight of financial markets is crucial and where it can be counterproductive. Better and more meaningful quantitative measurement is a fruitful future endeavor.

The Macro Financial Modeling project

Region: Systemic risk is one of the topics examined in the Macro Financial Modeling project that you co-direct with Andrew Lo of MIT. What are some of its goals? What kind of progress has it made?

Hansen: After the financial crisis, there was a very big push among research departments at central banks, various government agencies and in academia to take some of the existing modeling efforts in macroeconomics and make some modifications. A lot of these initial modifications were more like quick fixes. Model builders connected to governmental agencies were under pressure to get answers quickly, because they had to provide immediate guidance for policy choices. As a short-run response, that seemed perfectly reasonable and perfectly natural.

Andy and I, along with many distinguished economists in both macroeconomics and finance, collectively asked, “Isn’t now a good time to think more systematically about quantitative and empirical methods that will provide guidance not only for monetary policy, but for financial market oversight in the future?”

There was a variety of ways to rethink macroeconomic modeling and linkages to credit and other financial markets, including focusing on alternative mechanisms that hadn’t previously received much attention. Also, adding quantitative ambition to qualitative modeling insights seemed critical in pushing such research forward.

We have the luxury of taking a longer-term perspective on this research problem, whereas some of the research departments in the various central banks and regulatory agencies had to provide quick responses. We hope to accomplish a few things. First is to bring together people from the fields of finance and macroeconomics and engage in more conversation. Another is to nurture a new cohort of researchers just coming into this area with quantitative and empirical ambitions.

As we first got started, we received lots of sympathetic responses from the leaders in research departments, the public sector and central banks who participated in some of our conferences. That was very important because it helped expose academics to the types of challenges that public sector economists were facing. The resulting exchanges have helped to guide the research questions. The aim is to encourage the next step of constructing models that are richer along some dimensions, building on the successes of the previous modeling ventures and also exposing some of their failures.

Passing along lessons and future goals

Region: Your research career obviously comprises a wide range of interests and discoveries. How do you try to pass on the lessons you’ve learned to your students?

Hansen: I feel very lucky to have had just an incredibly rich array of graduate students. Right after the Nobel Prize announcement, before we went to Stockholm, a bunch of my former graduate students put on a conference here. It was extremely exciting to see the breadth of what they were doing. They weren’t like Lars Hansen clones; they were off doing their own thing.

I suspect my influence was not always direct because often I wasn’t telling them what to work on. Maybe I was giving them an interesting perspective on important problems to work on, and maybe I was trying to convince them to take modeling seriously. Perhaps I pushed them in some ways, but the exciting thing was just to see the breadth of their range of accomplishments. I’m proud of my research accomplishments, but I am also proud that I am able to associate with so many very good graduate students.

Region: I know you have a long list of them in your CV.

Hansen: There’s a Macro Finance Society that was created a few years ago, trying to nurture research in this area, especially that from younger scholars. And a fair number of the founding members of this group are my former students. These days, I have both students and “grandstudents”: students of students. It’s very hard for me to acknowledge my age in all this, but it’s just fun to see people that I had the opportunity to teach who are doing so well, enriching a field that, in many respects, barely existed back in the early ’80s.

Region: So, in closing, what should we expect to hear from Lars Peter Hansen in the next five to 10 years?

Hansen: This answer should be obvious. It is UNCERTAIN! I look to work on a variety of exciting projects, but I continue to find research in economic dynamics with quantitative ambitions to be fascinating. I hope to push further along some of the topics covered here related to the impact of uncertainty on economic analysis and on the shaping of prudent economic policies.

More About Lars Peter Hansen

Current Positions

David Rockefeller Distinguished Service Professor, University of Chicago, since 2010; Professor in Statistics, since 2007; on faculty since 1981

Director and Chair of Research Council, Becker Friedman Institute for Research in Economics

Previous Positions

Associate Professor, GSIA, Carnegie-Mellon University, 1980-81; Assistant Professor, 1978-80

Professional Affiliations

Co-Editor, Econometrica, since 2012; 1986-90

Vice President, American Economic Association, 2011

Honors and Awards

Honorary Academician of Academia Sinica, 2014

Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel, 2013

Distinguished Fellow, Macro Finance Society, since 2013

Frontiers of Knowledge Award in Economics, Finance and Management, BBVA Foundation, 2010

MSRI Prize in Innovative Quantitative Applications, CME Group, 2008

Fellow, American Finance Association, since 2007

Erwin Plein Nemmers Prize in Economics, Northwestern University, 2006

Fellow, National Academy of Sciences, since 1999

Member, American Academy of Arts and Sciences, since 1993

Fellow, Econometric Society, since 1985; President, 2007; First Vice President, 2006; Second Vice President, 2005

Frisch Medal, Econometric Society, Co-winner with Kenneth J. Singleton, 1984

Publications

Author and co-author of numerous papers on econometric methods, asset pricing and financial markets, uncertainty and risk aversion, robust control theory, and macroeconomic risk, growth and fluctuation

Education

University of Minnesota, Ph.D., economics, 1978

Utah State University, B.S., mathematics and political science, 1974

For further background, visit larspeterhansen.org.