VaR, or Value at Risk, estimates how much money could be lost on a given trade, or across a given portfolio, over a set time horizon and at a stated confidence level, using mathematical models and statistical probabilities.
Q. This is at the heart of the JPMorgan story. We know it stands for “value at risk”, but what else do we need to know?
A. First, VaR is supposed to answer a basic question: how much can I lose on a trade? Or, more generally for bigger institutions: looking across my portfolio, what's the most I have at risk on a given day? Ironically enough, JPMorgan (pre-Jamie Dimon) was the big financial institution that popularized this method of risk management.
Q. So, how does it work, exactly?
A. Well, there are different kinds of VaR models built on three basic approaches: variance-covariance, Monte Carlo simulation, and historical simulation, plus all sorts of variations and customizations. But they're all mathematical formulas that are supposed to account for the likelihood of losses, and the size of those losses, across a portfolio, given things like possible interest rate changes, equity market volatility, factors like that.
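To make the historical-simulation approach concrete, here is a minimal sketch in Python. It simply ranks past daily losses and reads off the loss that was exceeded only rarely; the return figures are made up for illustration, not real market data.

```python
# Minimal sketch of one-day historical-simulation VaR.
# Historical VaR at 95% confidence: the loss level exceeded on only
# about 5% of past trading days in the sample.

def historical_var(daily_returns, confidence=0.95):
    """Return the one-day VaR (as a positive loss fraction) at the given confidence."""
    losses = sorted(-r for r in daily_returns)  # flip returns into losses, ascending
    idx = int(confidence * len(losses))         # position of the confidence quantile
    idx = min(idx, len(losses) - 1)             # guard against running off the end
    return losses[idx]

# Hypothetical daily returns for a portfolio (fabricated for the example)
returns = [0.01, -0.02, 0.005, -0.015, 0.002, -0.03, 0.012, -0.008, 0.004, -0.01]
var_95 = historical_var(returns, 0.95)
print(f"1-day 95% VaR: {var_95:.1%} of portfolio value")  # prints "1-day 95% VaR: 3.0% of portfolio value"
```

Note what the sketch also illustrates about the method's weakness: the answer depends entirely on the window of history you feed it. A calm sample produces a reassuringly small VaR.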
Q. But obviously, something went wrong at JPMorgan…
A. Well, it's been widely reported that they had recently changed their VaR model, and there seems to be some finger-pointing around that. But that discussion completely ignores the more important point: these mathematical models will never fully describe all the risks in a portfolio.
Q. Ok, I’ll bite. Why not?
A. The models are only as good as the probabilities, and the data, that are fed into them. And even if they're good at predicting, say, what the price of the CDX should be on a given day, how much it "should" move, they can't account for the possibility that a firm's own trades will overwhelm the market and make it no longer an effective gauge of prices. At a very fundamental level, there are profound limitations to using mathematical models as a substitute for judgment and management.