This volatility is too low
This volatility is too high
This volatility is … holy cow … the SNB just removed its currency floor?
Investors in the big trading banks are spending the week discovering that financial institutions are the Goldilocks of the volatility world, after the likes of JPMorgan, Goldman et al released some disappointing fourth-quarter earnings despite an uptick in vol. With banks having spent much of the past five years moaning that volatility was too low to eke out trading profits, it turns out that in the fourth quarter volatility was just a bit too high to make money. Or not quite the right kind of volatility. Or whatever. Read more
JP Morgan’s always interesting Flows & Liquidity team have weighed in on the great Japanese yield panic. Japanese government bond yields have jumped since the Bank of Japan launched QE on steroids at the start of April, and volatility has risen with them — the 60-day standard deviation of the daily changes in the 10-year JGB yield jumping to 4bp per day, the highest since 2008 (that’s longer-term yields on the right for a bit of context):
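The 60-day figure JPM cites is just a rolling standard deviation of day-on-day yield changes. A minimal sketch of the calculation — the input series here is synthetic, purely to illustrate the arithmetic, not actual JGB data:

```python
import numpy as np

def rolling_yield_vol(yields, window=60):
    """Rolling standard deviation of daily yield changes.

    yields: sequence of daily yield levels in basis points (hypothetical).
    Returns one vol figure per complete 60-day window.
    """
    changes = np.diff(np.asarray(yields, dtype=float))
    return np.array([changes[i - window:i].std(ddof=1)
                     for i in range(window, len(changes) + 1)])

# Illustrative only: a random walk whose daily moves have a ~4bp std,
# roughly matching the figure JPM quotes
rng = np.random.default_rng(0)
yields = np.cumsum(rng.normal(0.0, 4.0, 300))  # 300 days of levels, in bp
vol = rolling_yield_vol(yields)
```

On real data the same two steps apply: difference the yield series, then take the standard deviation over a trailing 60-day window.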
That has understandably scared people who remember the volatility-induced selloff shock of 2003. From JPM: Read more
…it was just someone using Excel on a laptop who was highlighting cells for a formula and released his index finger from the left-clicky button of his mouse too soon.
Writes Irish stand-up comedian Colm O’Regan for BBC Magazine, in his piece about “The mysterious powers of Microsoft Excel”. As you will likely have guessed, his article was inspired by spreadsheet blunders in Reinhart and Rogoff’s 2010 Growth in a Time of Debt paper. Read more
Mr. Hagan had never previously designed a VaR model. According to JP Morgan Chase, having an employee from a business unit design the unit’s risk model was somewhat unusual, but it did not violate bank policy…
Mr. Hagan told the Subcommittee that he was told the objective of his research was to design VaR models that, when fed into the RWA model, would produce lower RWA results for the CIO, since both he and the CIO traders viewed the bank’s standard RWA model as overstating CIO risk.
Apologies for all the jargon there. The above is from the Senate Permanent Subcommittee on Investigations staff report on the JPMorgan “Whale trades” that lost the bank over $6bn in 2012.
While the portfolio that lost all those billions comprised outsized credit derivatives trades, building up such positions wouldn’t have been possible without a significant change in a key risk model the bank was using — the Value-at-Risk (VaR) model, which predicts possible losses over a given time horizon. Read more
On Monday, the Office of the Comptroller of the Currency and the Federal Reserve issued “enforcement actions” against JPMorgan, which makes it sound a lot more exciting than it is.
The slaps on the wrist for the “London whale” trades, and failures concerning anti-money laundering procedures, come with no fines and no admission or denial of any wrongdoing. The Fed does, however, reserve the right to take further action, and the UK’s Financial Services Authority said it’s still looking into it. Read more
Buried in Morgan Stanley’s decent third-quarter results (excluding the absurdity that is DVA of course) is this intriguing footnote:
Morgan Stanley’s average trading Value-at-Risk (VaR) measured at the 95% confidence level was $63 million compared with $76 million in second quarter of 2012 and $99 million in the third quarter of the prior year. The Firm modified its VaR model this quarter to make it more responsive to recent market conditions.
A golden quote, delivered by the maestro and Nobel laureate to Laurie Carver of Risk magazine. Harry Markowitz slips in a Rodney Dangerfield impersonation while talking regulation of banks’ trading book risk here:
Optimising mean return subject to variance beats doing it with respect to all the other risk measures, including VAR and expected shortfall – and VAR is the worst of them. The Basel Committee are not the first people to do something wrong because they haven’t read my work. They are misguided and I know who misguided them – people who have PhDs in mathematics or physics, who tell them that is what the experts use. But the so-called experts haven’t read Markowitz – I get no respect. Read more
JP Morgan’s second-quarter 10-Q is out – and so is its restated filing for the first quarter.
Of course, the bank has already opened the kimono (as Jamie Dimon might say) on the unwinding – and transfer to its investment bank – of the synthetic credit trades built up by its Chief Investment Office. Read more
Why is CreditSights highlighting JPMorgan’s late-2008 shift from a 99 per cent to 95 per cent confidence interval* in its Value-at-Risk measurement, here?
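One reason the confidence interval matters: moving from 99 per cent to 95 per cent mechanically lowers the reported VaR number. A minimal sketch under a normal-returns assumption (a simplification — banks’ actual models are far more involved, and the portfolio figures below are hypothetical):

```python
from statistics import NormalDist

def parametric_var(portfolio_value, daily_vol, confidence):
    """One-day parametric VaR assuming normally distributed returns.

    A textbook simplification, not any bank's actual model.
    """
    z = NormalDist().inv_cdf(confidence)  # e.g. 2.33 at 99%, 1.64 at 95%
    return portfolio_value * daily_vol * z

# Hypothetical book: $10bn of exposure, 1% daily volatility
var_99 = parametric_var(10_000, 0.01, 0.99)  # in $m
var_95 = parametric_var(10_000, 0.01, 0.95)
```

Under these assumptions the 95 per cent figure comes out roughly 30 per cent below the 99 per cent one (1.64 sigma versus 2.33 sigma) — the same trades, a smaller headline number.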
Hopefully that headline gets your attention for the Basel Committee on Banking Supervision’s latest review of capital rules for banks’ trading books.
There is a lot in it — the Committee has been tinkering with trading books since the crisis exposed serious mismatches between the capital that banks’ models said they needed for trading structured credit, and the losses they ended up experiencing. In fact this review follows up on the 2009 rule-set dubbed ‘Basel 2.5’. Read more
Hark — the standard deviation devils sing (again).
As Reuters columnist John Kemp pointed out yesterday, recent swings in the commodities complex have produced some impressive probability figures. The kind you can wheel out in dinner party conversation. For instance, front-month Brent crude futures sank almost $12 per barrel (or over 9 per cent) on Thursday, leading the market down from over $120 at the start of the day to under $110. Read more
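Those dinner-party probabilities come from expressing a move in standard deviations and reading off a normal distribution — exactly the assumption such columns tend to question. A sketch of the arithmetic, with a purely hypothetical daily volatility figure:

```python
from statistics import NormalDist

def move_probability(pct_move, daily_vol):
    """Express a one-day move in sigmas and give the probability of a
    drop at least that large, assuming normally distributed returns
    (the very assumption that makes such probabilities suspect)."""
    z = pct_move / daily_vol
    return z, NormalDist().cdf(-z)

# Hypothetical: a 9% one-day drop against an assumed 2% daily vol
sigmas, prob = move_probability(0.09, 0.02)
```

A 4.5-sigma event under a normal distribution is a once-in-many-centuries affair — which, given how often commodity markets produce them, says more about the normality assumption than about the market.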
Glencore’s appetite for risk in commodities trading is bigger than that of leading Wall Street banks, according to information released by the banks underwriting the trading house’s multibillion-dollar flotation, the FT reports. The banks’ reports come ahead of Glencore’s IPO. The research reveals that Glencore could have lost a daily $42.5m last year on average when measured by so-called “value-at-risk,” much more than the average $25.7m put at risk each day in 2010 in commodities trading by Goldman, Morgan Stanley, Barclays and JPMorgan. The Telegraph adds that pre-IPO research written by Liberum Capital states that Glencore is at risk because of volatile commodities markets and falling margins that pose a “key investment risk.” The WSJ meanwhile reports that Asian and Middle Eastern sovereign wealth funds are in advanced talks for shares in Glencore’s listing.
And the carry trade/currency weirdness continues:
LONDON (Dow Jones)– U.K. bank Barclays Capital pulled yen prices off its Barx dealing system for a short period Wednesday, as the Japanese currency fizzed to its strongest levels on record, a person familiar with the situation said Thursday. In a spectacular move, the dollar collapsed against the yen at 2100 GMT Wednesday, sinking 4% to hit a record low of Y76.25. Read more
Oil price volatility is no doubt producing ample trading opportunities for many in the market, but as of Friday it has become much more expensive to take advantage of them.
The CME on Thursday announced it would be raising margins on trading crude oil by about 20 per cent for both speculators and hedgers, as of February 25. Read more
Another day, and another widening in the WTI-Brent front-month future spread — this time to what looks to be approaching record wides.
The spread hit as much as -9.50 on Monday and according to Bloomberg data the record for the differential stands at -10.67, as struck on February 12, 2009: Read more
Perhaps it’s not too astounding a finding…
But a Federal Reserve staff working paper by Dobrislav P. Dobrev and Pawel J. Szerszen has found that using historical high frequency data to forecast equity returns is far more effective than using general daily or monthly data. Read more
Goldman Sachs and Morgan Stanley each suffered at least 10 days of trading losses in the second quarter, underlining how turbulent markets have cast a pall on Wall Street since April, the FT says. The banks’ trading results deteriorated sharply from the first quarter of the year, before uncertainty about the US economy, European sovereign debt and the fate of new financial industry regulation sapped investors’ confidence. In particular, NYT DealBook says, losses on Goldman Sachs’s trading desks were over $100m on three days during the period that ended on June 30, exceeding the bank’s own VaR estimate.
It’s that time in the quarter when we get to see how successful Goldman Sachs has been at making money.
Without further ado then, here’s the 10-K chart that shows the frequency distribution of daily trading net revenues for the year ended December 2009: Read more
Here’s something you may have missed down Mexico way.
From the Wall Street Journal: Read more
Another bank result, this time from Morgan Stanley, now the second major US bank to report a(n albeit lower-than-expected) profit so far this fourth-quarter earnings season.
Analysts had expected Morgan Stanley to post earnings of 36 cents a share on revenue of about $7.8bn, which means the company has missed expectations with that diluted EPS from continuing operations of 14 cents, and net revenue of $6.8bn. Read more
Goldman Sachs’ Q3 daily trading revenues from the bank’s just-released 10-Q filing:
Here’s the fifth major US bank to report earnings this third-quarter season — Morgan Stanley with attributable net income of $757m.
The bank managed to eke out the profit, with diluted earnings per share of 38 cents, on net revenues of $8.7bn. Analysts had expected EPS of 11 cents on revenue of $6.99bn, so this is a pronounced beat. It also means that the bank has managed to break a string of three straight quarterly losses. Read more
The BIS has released its analysis of proposed changes, adopted in July, to Basel II capital rules.
It’s basically an exercise in seeing just how much more capital banks will have to hold under the rejigged rules, which are due to come into effect in January. Read more
Alternate title: Building a better Monte Carlo model.
Risk managers and investors will, of course, be familiar with Monte Carlo simulations — which are used in finance to estimate things like potential loan losses, portfolio risk and derivatives values. The precise simulation varies from model to model but in general works something like this: define a domain of possible inputs, generate random inputs from the domain, apply some algorithms and then aggregate the results. Read more
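Those four steps can be sketched in a few lines. A toy Monte Carlo VaR calculation — the return distribution, portfolio size and path count are all illustrative assumptions, not anyone’s production model:

```python
import random

def monte_carlo_var(n_paths=100_000, mu=0.0, sigma=0.01,
                    portfolio=100.0, confidence=0.95, seed=42):
    """Monte Carlo VaR following the steps in the text:
    (1) define the input domain -- here a normal return distribution,
        itself an assumption;
    (2) generate random inputs from that domain;
    (3) apply the algorithm -- map each return to a portfolio P&L;
    (4) aggregate -- take the loss at the chosen percentile.
    """
    rng = random.Random(seed)
    pnl = sorted(portfolio * rng.gauss(mu, sigma)      # steps 1-3
                 for _ in range(n_paths))
    return -pnl[int((1 - confidence) * n_paths)]       # step 4

var_95 = monte_carlo_var()
```

With a $100 portfolio and 1 per cent daily vol, the simulated 95 per cent VaR lands near the analytic answer of about $1.6 — the point of Monte Carlo being that it still works when the inputs are too messy for an analytic answer.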
What happens when you get Rick Bookstaber and Nassim Nicholas Taleb in the same room, to talk about one of the most controversial risk measures of the financial crisis?
They (almost) agree. Read more
On Thursday, the US House of Representatives Committee on Science & Technology will turn its attention to financial modeling.
Specifically, the committee will be scrutinising the role played in the current financial crisis by the much-maligned Value-at-Risk model, which is meant to measure the maximum likely loss on a banking portfolio at a given probability over a set time horizon. Read more
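In its simplest, historical-simulation form, the idea is just a percentile of past P&L. A minimal sketch — the P&L series below is toy data, not from any filing:

```python
def historical_var(pnl_history, confidence=0.99):
    """Historical-simulation VaR: the loss not exceeded with
    `confidence` probability, read off the bank's own past daily P&L.

    pnl_history: list of daily P&L figures (hypothetical data).
    """
    losses = sorted(-p for p in pnl_history)   # losses, ascending
    idx = int(confidence * len(losses))        # e.g. the 99th percentile
    return losses[min(idx, len(losses) - 1)]

# Toy example: 100 days of P&L swinging between -10 and +10
pnl = [((i * 37) % 21) - 10 for i in range(100)]
var_99 = historical_var(pnl)
```

The catch the committee is interested in, of course, is that the model only knows about losses that resemble ones already in the lookback window.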
Felix Salmon and Phorgy Phynance have a couple of very interesting posts on VaR.
Value at Risk models, which are meant to measure potential losses on a portfolio, have been grabbing headlines of late. Goldman’s VaR increased to $245m in the second quarter of 2009. Barclays, we also note, appears to be ratcheting up the risk, upping its one-day VaR to a high of £118.7m in the first half of this year. Read more
From the bank’s second-quarter 10-Q:
Comment columns are awash today with talk of Morgan Stanley’s trading prudence, as evidenced by its lower Q2 Value at Risk number.
For instance, here’s Graham Bowley at the New York Times: Read more
One of the centrepieces of Lord Turner’s review of banking regulation is the potential use of so-called stressed VaR — an attempt to rectify the failings of the ‘normal’ Value at Risk model, one of the pillars of upcoming banking regulations on market risk.
From the report: Read more