The Economist’s financial markets column, at which I, Baruch, have taken a swipe recently, is uncommonly interesting today. I have to comment on it, mainly because it touches on a number of things I have written about in the past few weeks, and foreshadows a blog post I was planning anyway, on the further implications of Nassim Taleb’s excellent book. The author of Buttonwood has clearly been reading Ultimi Barbarorum, and because of this I have to say, grudgingly, that s/he may be making some epistemological progress.
Taleb is getting some mind-share. The Black Swan concept is catching people’s attention, and the book has clearly inspired the column. As we shall see, Buttonwood has probably not actually read the book, but it’s pretty certain he’s read a review of it, or someone mentioned it over lunch, which, as we all know, is as good as reading the thing. “Terrorism is no longer much of a ‘black swan’ event for markets,” says Buttonwood, marvelling at the lack of impact the recent self-combusting doctors in the UK have had on stocks. Are investors complacent? Niall Ferguson, for instance, who Buttonwood imagines should know, warns that a huge Middle Eastern conflagration is “much more likely than most people think”. Were this to happen, Buttonwood suggests, investors’ eyes would water.
Cue gleeful fantasies about a war with Iran, trillions wiped off the stockmarket, bankers raining down from tall buildings, etc., you would think. Amazingly, Buttonwood instead seems to have gained some balance; he is using the arguments made in my latest post, which, I think, he would have had time to read before the column went to press. Complacency about a possible war in the Middle East (or a huge meteorite shower, or an attack of 50-foot killer tomato-women, adds Baruch) may, “in an odd way, be rational”:
Betting on a black swan does not offer attractive odds. If a catastrophe only happens about 1% of the time… the investor will underperform the benchmark and lose clients. Staying fully invested, and waiting to get sandbagged by events [note: this is exactly Baruch’s current investment stance], makes more business sense. After all, if a catastrophe does happen, the investor has a perfect excuse: nobody saw it coming. The chances are that everybody’s portfolios will suffer in tandem.
This is clearly a huge improvement on the column’s previous perma-bearishness, so much so that it would be churlish to point out that the second reason given for staying long (the comfort of the herd) is not a particularly positive rationale. Given the vast number of commentators, and some practitioners, out there who are negative on the markets, including the Buttonwood of last week, and who are all dying for the chance to say “I told you so”, it may not be true either.
Now, more seriously, Buttonwood has misunderstood the black swan concept. He does write for the Economist, after all. Serendipitously, however, correcting his error supports the wider point he is making. He is wrong in two ways: first, the fact of predicting a black swan falsifies it qua black swan. It turns it into a “Gray Swan”, and as such it loses most of its impact. A black swan is not, as Buttonwood writes, merely “an event with a small probability of happening but a big impact if it does”. According to Taleb, a true black swan is impossible to predict (except with the vast benefit of hindsight), and we cannot make judgements about its probability; for all we know the next one may even be “probable”. They do have big impacts, but, and this is important, they are not always Bad Things; they are as likely to be Good. 9/11 is a classic black swan event, but so was the rise of the internet. Thus Buttonwood’s second error is not realising that the black swan might equally be something positive, something wealth-creating. This of course changes the expected profit-loss of the project dramatically; say, for the sake of argument, that a black swan is indeed 1% likely in any iteration of Buttonwood’s particular example, but that 1% event, if it occurs, has a 50% chance of being fantastically wonderful (say, a return of plus 1000%)! The chances of catastrophe are reduced dramatically, from 1% to 0.5%, and the potential impact of the asymmetric outcome of that catastrophe (minus 100%, maybe even minus 1000%, and you owe money) is probably at least offset sufficiently to negate it. If the 99%, non-black-swan, business-as-usual outcome carries a return of, say, -5% to +20%, I am quids in, according to this particular ludic fantasy (of course real life would prevent us from being able to usefully guess these probabilities), and, as Taleb might not have put it, living in fill-your-boots-istan.
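For what it’s worth, the thought experiment above can be tallied up explicitly. Every number below is one of the made-up figures from the preceding paragraph (a 99% business-as-usual year, a 1% swan split evenly between +1000% and total loss, an ordinary-year range of -5% to +20% taken at its midpoint); this is a sketch of the arithmetic, not a model of anything real:

```python
# Back-of-envelope expected return for the ludic fantasy above.
# All probabilities and payoffs are the paragraph's hypothetical numbers,
# not estimates of anything; real black swans defy exactly this exercise.

def expected_return(outcomes):
    """Expected value of a list of (probability, return) pairs."""
    total_p = sum(p for p, _ in outcomes)
    assert abs(total_p - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * r for p, r in outcomes)

outcomes = [
    (0.99, 0.075),   # business as usual: midpoint of -5%..+20%
    (0.005, 10.0),   # good swan: +1000%
    (0.005, -1.0),   # bad swan: total loss (-100%)
]

print(f"Expected return: {expected_return(outcomes):.2%}")
```

Even with the bad swan wiping you out entirely, the symmetric 1000% upside drags the expected return well above the ordinary-year midpoint; the point is not the specific figure but that a two-sided swan changes the sign of the bet.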
As an aside, my fund specialises in technology, which also includes alternative energy stocks. A war with Iran, involving the closing of the Gulf, etc., could have an initially negative impact. But thereafter, the rising oil price would likely lead our mainly solar stocks to outperform hugely. Our technology stocks also tend to benefit from secular trends, and are somewhat insulated against the vagaries of the short-term economic cycle. Short of an actual recession, they would probably outperform in a lower-growth environment as well. So while I do not exactly pray for the US to invade Iran, I wouldn’t necessarily view it as a catastrophic black swan event for our portfolio, either.
The point about Taleb’s book, which I was hoping to make in an upcoming blogpost, should now be made here. The use of the black swan idea is not, as is commonly thought, to point out the dangers of doing anything risky which would expose us to the dark side of the swan. It is to show us that we really have very little idea, much, much less than we think, about what the future will be like. Taleb’s thesis is also that the future will come in great leaps, that we live increasingly in Extremistan, where grossly unequal outcomes predominate, and where power laws rule. Taleb also thinks taking risk is hard-wired into us; to avoid taking risk would be to deny our human nature. As he puts it, he does not want us to stop crossing the street; he wants us to cross the street without blindfolds. The practical conclusions of his ideas are simple, and prosaic. They can be summed up in a few phrases: be prepared; observe; try to turn black swans gray; do not trust experts wielding regression analyses and standard-deviation-based theories; try to put yourself in positions where you can benefit from black swans; take every opportunity that comes your way; beware of anyone who has a precise plan. And so on.
This leads us to another point in favour of the “wait, fully invested in tech stocks, to get sandbagged” position I have taken. First, high-growth sectors like mine are more likely than, say, a utility fund or a convertible arb hedge fund, to find the next super-dominant growth story, the next Microsoft or Nokia with 5000% upside. We are all looking for them, but that doesn’t mean one day we won’t find one. Second, if I remain prepared for swans of different shades, I may be able to react faster than others when I see one. I may have a better chance than them of seeing the swan for what it is, before it slaps me upside the head. This is what I tell myself anyway.
So, to conclude, Buttonwood and the Economist may be getting better. Let’s see. But that is not the real point. I am still very aware that I am supporting the arguments of someone extremely skeptical about the ability of humans to have “adequate ideas”, as Spinoza put it, about anything at all, on a blog dedicated to the author of a philosophy which stresses man’s ability to reason. There is still a mega post to be written on this. Hayek should also feature strongly. One of these days I will do it, or link to someone who has.