Saturday, May 12, 2012

A good point on J.P. Morgan

(If you're still getting caught up on the J.P. Morgan story, you should probably go over to Marketplace and check out Heidi Moore's explanation of the fiasco.)

I don't have a direct source for this beyond having heard it on either Marketplace or All Things Considered, but a financial reporter made an observation I've been waiting to hear put concisely ever since this story broke.

The reporter explained that the group responsible for the huge recent loss had been given a dual mandate: hedge against losses and make lots of money. The reporter then wondered whether assigning those two mandates to the same team was really a good idea.

I wonder if this is an example of something I've seen before, or at least whether that something contributed to it. One of the recurring themes in these bad-finance stories is the arrogant dismissal of seemingly obvious points, things like "it's hard for a hedge to provide protection against big, risky bets going bad when it's also a big, risky bet." Part of this arrogance may come from people's mistaken belief in their own sophistication.

There's a fallacy common among people who build models and handle data. It's the belief that complication implies sophistication. The truth is largely the opposite: complication amplifies naivety. When people use complicated, impressive-sounding systems, they tend to be less concerned with common-sense questions like "Is my sample representative?", "Are the relationships we're assuming stable?", and "Are they likely to break down under extreme conditions?"
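
To put that last question in concrete terms, here is a minimal sketch with entirely made-up numbers (nothing from the J.P. Morgan episode): a hedge ratio is fitted on a calm sample where the relationship between two instruments holds, and the same hedge is then applied in a stressed period where it doesn't.

```python
# A toy illustration (all numbers invented): a hedge ratio estimated from
# calm-market data is applied in a stressed regime where the assumed
# relationship no longer holds.
import numpy as np

rng = np.random.default_rng(0)

# "Calm" sample: the position and its hedge instrument move closely together.
n = 1000
hedge_calm = rng.normal(0.0, 0.01, n)
asset_calm = 0.9 * hedge_calm + rng.normal(0.0, 0.004, n)

# Fit the hedge ratio by regressing asset returns on hedge-instrument returns.
beta = np.polyfit(hedge_calm, asset_calm, 1)[0]

# Stressed regime: the relationship breaks down and both legs get volatile.
hedge_stress = rng.normal(-0.03, 0.03, 250)
asset_stress = rng.normal(-0.04, 0.05, 250)   # now nearly unrelated to the hedge

# "Hedged" P&L: long the asset, short beta units of the hedge instrument.
pnl_calm = asset_calm - beta * hedge_calm
pnl_stress = asset_stress - beta * hedge_stress

print(f"hedge ratio fitted on calm data:  {beta:.2f}")
print(f"hedged P&L volatility, calm data: {pnl_calm.std():.4f}")
print(f"hedged P&L volatility, stressed:  {pnl_stress.std():.4f}")
# The book looks quiet in the sample the hedge was fitted on and is roughly an
# order of magnitude riskier once the assumed relationship stops holding.
```

The arithmetic here is trivial, but the failure mode is the same in far fancier models: the risk number is only as good as the assumption that the fitted relationship survives the conditions you actually need protection in.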

In the period leading up to the crash, we heard a great deal about how sophisticated Wall Street and the financial sector had become, particularly with respect to risk. I wonder if we, in fact, saw just the opposite. Did the rise of the quants and the reliance on elaborate models simply enable naivety and wishful thinking?

I'm not claiming that this was the (or even a) primary cause of the crisis. I would put before it, in no particular order, greed, misaligned incentives, deregulation, the growth fetish and possibly a few other candidates. Still, I think it's fair to say that things were made worse by the belief that having complicated formulas to deal with risk somehow meant you were safe from it.

Update -- I took advantage of the ten-minute blogging rule (you have ten minutes after posting to change things) and added the third paragraph.

2 comments:

  1. Another reason I heard from someone a while back is that some of the risk models weren't being used properly.

    Let's say someone creates a prediction model with 12 independent variables. And let's say half of those variables are a little murky (expensive or hard to properly determine). But you go ahead and run your prediction using the 6 variables that you do know. (A quick sketch of this scenario follows the comments.)

    Don't be surprised when the prediction is wrong...very wrong.

    Replies
    1. Garbage in/garbage out is definitely one of those common-sense rules I was thinking of.

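Since the scenario in that first comment is concrete enough to simulate, here is a quick sketch of it. Everything in it is invented for illustration (the 12 variables, their effects, and the noise level): the outcome genuinely depends on all 12 predictors, but the model is fit and run with only the 6 "known" ones.

```python
# A toy version of the commenter's scenario: the outcome depends on 12
# predictors, but the model drops the 6 "murky" ones and uses only the rest.
import numpy as np

rng = np.random.default_rng(1)

n = 2000
X = rng.normal(size=(n, 12))            # 12 independent variables
true_coefs = rng.uniform(0.5, 1.5, 12)  # all 12 genuinely matter
y = X @ true_coefs + rng.normal(0, 0.5, n)

# Full model: uses all 12 variables.
full_coefs, *_ = np.linalg.lstsq(X, y, rcond=None)
full_error = np.sqrt(np.mean((X @ full_coefs - y) ** 2))

# "Practical" model: the 6 murky variables are dropped; only the known 6 remain.
X_known = X[:, :6]
partial_coefs, *_ = np.linalg.lstsq(X_known, y, rcond=None)
partial_error = np.sqrt(np.mean((X_known @ partial_coefs - y) ** 2))

print(f"RMS prediction error, all 12 variables: {full_error:.2f}")
print(f"RMS prediction error, only 6 variables: {partial_error:.2f}")
# The dropped variables carry real signal, so the stripped-down model's
# predictions are several times noisier even though it looks like the same model.
```

Dropping the murky variables doesn't make them stop mattering; it just shoves their effect into the error term, which is exactly the kind of common-sense point the complicated machinery makes easy to overlook.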