Thursday, June 29, 2017

Gambler's Ruin and Bet Sizing

A topic of abiding interest.
From ZeroHedge, June 25:

"Big Swinging Dick" Defined
"Big Swinging Dick: (Very) informal and somewhat derogatory; a trader who believes his methodology is perfect and will always result in sizable profits. However, it originally was a term of self-designation for major bond-traders. The term was popularized by the book Liar's Poker, which describes the author's experience as a bond trader on Wall Street in the 1980s." Source: financial dictionary
While much has been written about trade size, meaning how much trading capital should be risked on each trade, it remains a nebulous, imprecise topic.
That’s unfortunate, because it can make all the difference between staying in the game and losing all your trading capital – known in the business as “blowing up”. Yes, even (especially) if you are a big swinger.

The concept of a "trading edge" is central here. Broadly speaking, this is how much a trader is expected to make over a reasonable number of trades (meaning, with some statistical significance), taking into consideration the risk of loss and how much will be made or lost with each trade on average.

Basically, if you don’t know what your edge is you should not be trading, or gambling for that matter. Buying and holding for some time perhaps, but not regularly going in and out of the market without a good plan.

Let’s assume that you are in a situation where you have a negative edge, which by definition means you are likely to lose money over a large enough number of trades. The conventional example is a casino, where, as we know, the house always wins in the end for precisely that reason. So as you walk through the entrance, how large should your bet size be?
Surprisingly to some, the answer is actually 100% of your capital, meaning all of it, in one go – all the more so if the edge is especially unfavorable to you.

Casinos want to keep their patrons gambling for as long as possible, because they know that the longer patrons keep playing, the greater the chance they will give back all their winnings and then some. Therefore, if the goal is to maximize expected profits, go for all or nothing, then walk away (unless you are there to have a good time, in which case bet as little as possible each time).

The other situation where 100% is appropriate lies at the other end of the spectrum, when a trader just can’t lose. The more capital traded each time, the higher the profits will be, since there are no losses. This of course is unrealistic in any speculative endeavor; it’s mentioned here for illustrative purposes only.

The much trickier question lies in between those two scenarios, when there is a positive edge but no certainty of profit. What to do in this situation?

Let's assume there is a 99% chance of doubling your money versus 1% of losing it all, which are fantastic odds. Since we are near certainty, should we bet everything once again?
A big swinger surely would. However, unlikely as it may be, we can still suffer a loss, which would instantly put us out of the game. The economic loss is actually much greater than the trading capital in this case, because had we managed to hold on, we could eventually stand to make a lot of money with those fantastic odds. But as fate would have it, we blew up. Big swinger no more.
So no matter how good the odds, under uncertainty there is nothing preventing a string of losses from occurring (that’s how life works, by the way).

The key point here is endurance, meaning having enough trading capital – and mental stamina – on reserve so that if one of those mean streaks hits you can survive and eventually get back to the higher probabilities of winning again.

As such, defining how much should be risked on each trade requires a quantifiable (even if not exact) framework that takes all these factors into consideration. Fortunately, this can be done without too many mathematical gymnastics. Here’s one way to do it.

Let’s assume we trade a fixed amount of our capital, called “R”. That's how much we are risking each time. As a result, trade profits and losses can now be expressed as multiples of R, such as a loss of 2R (-2R) or a gain of 1R (+1R). This way the framework is standardized and can be applied to any futures contract, stock, FOREX position or bond.

Next we assign a probability of occurrence to each resulting R level (usually based on historical analysis or backtesting). Now we can quantify the edge, which is, roughly speaking, the sum-product of the two: the probability-weighted average R earned per trade.
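As a minimal sketch of that calculation, here is one way to express it in Python. The probabilities below are purely hypothetical (the article's actual distribution sits in its graph, which is not reproduced here), but they are chosen so the expected value comes out to the 0.2R per trade used in the example that follows.

```python
# Hypothetical R-multiple distribution: each trade loses 1R, breaks even,
# gains 1R or gains 2R with the probabilities below (assumed figures only).
outcomes = [-1, 0, 1, 2]                  # profit/loss per trade, in units of R
probabilities = [0.45, 0.10, 0.25, 0.20]

# The edge is the "sumproduct" of the two: the probability-weighted average R.
edge = sum(o * p for o, p in zip(outcomes, probabilities))
print(f"expected value per trade: {edge:.2f}R")   # 0.20R with these numbers
```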

The graph below shows a hypothetical trading model that delivers 0.2R per trade on average. That’s a very decent trading system: not only are the odds of not losing money higher than the odds of losing it, the wins are skewed heavily to the upside.
Any trader would love to use such a system any time of the year. However, even if the results look great on paper, it is worth remembering that they are based on statistical averages. This can make all the difference in the world, potentially generating some pretty big swings in your accumulated trading capital over time.

To illustrate how, let’s also assume that we trade this system once every week for ten years, so 52 x 10 = 520 trades in total. We define cumulative loss as the maximum loss, in terms of R, from a trading-capital high to a subsequent trading-capital low over the course of those ten years – in effect, the maximum drawdown. This is actually a very important number, since a lot of things can happen during that time frame.

As a side note, we are putting on the trades sequentially in a portfolio of one security only. This avoids the complication of figuring out correlation coefficients between securities, where any correlation less than perfect would yield some diversification benefit, in principle reducing the cumulative loss. So we are using the most conservative scenario here.

Finally, Microsoft Excel is used to generate 520 random trades, 1,000 times over, to derive the distribution of cumulative losses. This is like 1,000 traders using exactly the same system completely independently, and that high number carries some statistical robustness. The resulting histogram is shown below:
We can immediately observe how prominent a role luck plays in trading (or in any activity that involves risk, for that matter), even when the edge of the system is quite positive: traders using exactly the same system can experience cumulative losses ranging from 5R all the way to 43R. Of course these are extremes (or tails) with limited occurrences, but they highlight the point.
There is some concentration of outcomes around 12R, the cumulative loss with the most occurrences. But the chance of actually losing more than, say, 19R at some point is over one in five, so the tail risk is clearly not insignificant. And it is also a function of trading over ten years using this particular system. If it had been, say, just two years (104 sequential trades), that figure would have been 2%, roughly one tenth of that.
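For readers who prefer code to spreadsheets, here is a rough Python sketch of the same Monte Carlo exercise, using the hypothetical R distribution assumed above in place of the article's actual model. It simulates 520 weekly trades per trader for 1,000 traders and records each trader's worst peak-to-trough loss in units of R.

```python
import random

# Hypothetical R-multiple distribution (assumed figures; edge = 0.2R per trade).
OUTCOMES = [-1, 0, 1, 2]
PROBS = [0.45, 0.10, 0.25, 0.20]

def max_drawdown_R(n_trades, outcomes=OUTCOMES, probs=PROBS):
    """Worst peak-to-trough loss, in units of R, over one simulated trade sequence."""
    equity = peak = worst = 0.0
    for _ in range(n_trades):
        equity += random.choices(outcomes, weights=probs)[0]
        peak = max(peak, equity)
        worst = max(worst, peak - equity)
    return worst

def simulate(n_traders=1000, n_trades=520, probs=PROBS):
    """One cumulative-loss figure per simulated trader, all using the same system."""
    return [max_drawdown_R(n_trades, probs=probs) for _ in range(n_traders)]

if __name__ == "__main__":
    drawdowns = simulate()
    print(f"average cumulative loss: {sum(drawdowns) / len(drawdowns):.1f}R")
    print(f"share losing more than 19R: {sum(d > 19 for d in drawdowns) / len(drawdowns):.0%}")
```

The exact percentages depend on the distribution assumed, so the output will not match the article's histogram precisely; the point is the shape of the exercise, not the specific numbers.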

In other words, the more you trade the higher the odds you will have a big losing streak at some point, unlikely as that may be. As professional traders often remind us, the worst drawdown is always ahead of us. We can see why here.

That’s a hugely underappreciated point that is often missing from discussions about trade size. And it makes sense: the more you walk in the rain, the higher the likelihood you will get wet - even if you have a great umbrella.

If you are planning to trade sporadically and opportunistically, that’s one thing; but if you are considering regularly going in and out of the market over many years that's quite another. Again, it bears repeating that the chances of something bad happening increase with the number of trades.
The size of the edge also makes a huge difference to the outcome. Let’s assume that the edge in the model above is now reduced from 0.2R to just 0.05R (by taking out five percentage points from the odds of getting 2R and allocating them to -1R). The chance of losing more than 19R over ten years is now a staggering 73%.
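Using the sketch above, that sensitivity is easy to reproduce: shift five percentage points of probability from the +2R outcome to the -1R outcome (hypothetical numbers again) and re-run the simulation. With the assumed distribution here the exact figure will differ from the article's 73%, but the deterioration in the tail is the same phenomenon.

```python
# Same hypothetical model, with 5 percentage points moved from +2R to -1R;
# the edge drops from 0.2R to 0.05R and the drawdown tail fattens markedly.
weaker_probs = [0.50, 0.10, 0.25, 0.15]
drawdowns = simulate(probs=weaker_probs)
print(f"share losing more than 19R: {sum(d > 19 for d in drawdowns) / len(drawdowns):.0%}")
```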

So the worse the edge, the greater the chances of having a string of losses. As it should be, of course.

Why are such tail numbers relevant? Because if there isn’t enough capital set aside you will blow up your trading account. Pure and simple.

The higher that tail risk, the smaller the trade size should be, so you have enough cash stashed away for that rainy day. And this should have implications for how trades are structured.

Moreover, market conditions can and do change, possibly even invalidating the trading system and the edge altogether. In times of great volatility for instance, where swings can frequently occur in both directions, it pays to play even safer than suggested by these figures....MORE
Some of our previous posts on the subject:
March 2008
Markets, Risk and Gambler's Ruin

June 2011
Dreamtime Finance (and the Kelly Criterion)
I've been meaning to write about Kelly for a couple years and keep forgetting. Today I forget no more.
In probability theory the Kelly Criterion is a bet sizing technique used when the player has a quantifiable edge.
(When there is no edge the optimal bet size is $0.00)

The criterion will deliver the fastest growth rate balanced by reduced risk of ruin.
You can grow your pile faster, but you increase the risk of ending up broke should you, for example, bet 100% of your net worth in a situation where you have anything less than a 100% chance of winning.

The criterion says to bet, as a percentage of your current bankroll, roughly your advantage divided by the variance of the game/market/sports book, etc.
Variance is the standard deviation of the game squared. In blackjack the s.d. is 1.15, so the variance is 1.3225.

As blackjack is played in the U.S., the most a card counter can hope for is a 1/2% to 1% average advantage, with much of that average accruing from the fact that you can get up from a negative table.
Divide that advantage by 1.3225 and you've got your bet size as a fraction of your bankroll.
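As a quick worked example under those assumptions (a 1% advantage and the 1.15 per-hand standard deviation quoted above):

```python
# Kelly-style sizing: fraction of bankroll = advantage / variance.
advantage = 0.01            # assume a 1% average edge for the card counter
std_dev = 1.15              # per-hand standard deviation quoted for blackjack
fraction = advantage / std_dev ** 2
print(f"bet roughly {fraction:.2%} of bankroll per hand")   # about 0.76%
```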

It's a tough way to grind out a living, but hopefully this exercise will stop you from pulling a Leeson, betting all of Barings' money and destroying the 233-year-old bank.

I'll be back with more later this week. In the meantime here's a UWash paper with the formulas for equities investment....

August 2015
"The High Stakes History of Card Counting (And Its Uncertain Future)"
One of the rules of life:

NEVER EVER play a negative expectation game unless forced.
More after the jump....

What Proportion of Your Bankroll Should You Bet? "A New Interpretation of Information Rate"
How did Ed Thorp Win in Blackjack and the Stock Market?
Journal of Investment Consulting: Interview With Edward O. Thorp
Markets, Risk and Gambler's Ruin
"Not in my house: how Vegas casinos wage a war on cheating"

Finally, another rule of life:

Cassandra's (Not so) Golden Rules About Investing (And Not Investing)
#21. NEVER double-down (except when you have material non-public information and deep pockets) or if you're Ed Thorp, or if you're playing at The Martingale Room. 

Don't double down, double up.