Red Taylor: Kelly Criterion FAQ

FAQ: Kelly Criterion, Utility Functions, and Certainty Equivalent

By Red Taylor. Note the copyright notice at the bottom.

Q0: What questions does this “Kelly Criterion, Utility Functions, and Certainty Equivalent” FAQ answer?

A0: Q0: What questions does this “Kelly Criterion, Utility Functions, and Certainty Equivalent” FAQ answer?
Q1: What is wrong with maximizing your expected winnings?
Q2: What is the “Kelly Criterion” and what are “Utility Functions”?
Q3: What is “Certainty Equivalent”?
Q4: How can certainty equivalents be used in a practical setting?
Q5: What is the optimal bet size?
Q6: Which blackjack rules and camouflage are best?
Q7: How does taxation affect proper bet size?
Q8: How big is my bankroll?
Q9: What is a good Kelly Number for me?
Q10: Where can I learn more about utility theory?
Q11: How can I contribute to this FAQ?

Q1: What is wrong with maximizing your expected winnings?

A1: Although at first glance it seems obvious that it is best to maximize the expected (i.e., predicted average) amount of your winnings, this is in fact not true for most people. If this were really your goal then whenever you had the slightest advantage you would mortgage your house, car, and boat and bet your entire fortune. Although this gives you the greatest net win, on average, this is entirely too risky for most people.

Q2: What is the “Kelly Criterion” and what are “Utility Functions”?

A2: The Kelly utility function and other utility functions give an economically justified and mathematically precise way to compute optimal bets that lead to large winnings while limiting total risk. The Kelly criterion dictates that you should try to maximize the expected logarithm of your total bankroll rather than trying to maximize the expected bankroll itself. In other words, you should try to maximize the exponential rate of bankroll growth.

Many systems tell you to bet an amount that is proportional to your bankroll but the Kelly criterion is best among these in that as the number of bets you make increases, the chance that Kelly betting will beat these other systems approaches 100%. Nonetheless, the Kelly utility function is considered too risky by many professionals and another utility function is used instead.

The function log(x) is called the utility function of the Kelly criterion. It corresponds to a “Kelly Number” of 1. Another utility function is x^(1-1/k) / (1-1/k) for k = 0.3. It is the utility function that corresponds to a Kelly Number of 0.3. Either of these utility functions is useful in evaluating how much a particular proposition is worth to you. This worth is called the “Certainty Equivalent” and it provides a way to compare different bets and betting strategies.

Q3: What is “Certainty Equivalent”?

A3: Would you rather make a bet of $200 on a coin flip with an average profit of $20, or accept $5 risk-free? Would $10 risk-free persuade you not to make the bet? How about $15? Your “certainty equivalent” (or risk-free equivalent) is the amount that participation in the bet is worth to you — perhaps $5, $10, or $15 in this example.

A Kelly bettor with Kelly Number 0.3 maximizes the expected value of u(x) = x^(1-1/k) / (1-1/k), where k = 0.3 and x is the resulting bankroll. Suppose the coin flip above wins $200 with probability 55% and loses $200 with probability 45% (an average profit of $20). If your bankroll is $10,000 then the $200 bet gives an average value of u(x) of

55% * u(10200) + 45% * u(9800) = some number

If instead you were offered an amount “CE” risk-free the average value of u(x) would be

100% * u(10000 + CE) = some other number

These two expressions are equal when CE = $13.38. This is the “certainty equivalent” of the above bet for you if you are a Kelly bettor with Kelly Number 0.3 and a $10,000 bankroll. This amount, $13.38, is how much participation in the bet is worth to you. In particular, if the CE for this bet were negative the bet would be worth a negative amount to you and you should avoid it if possible.
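The inversion described above can be carried out in a few lines. The sketch below (variable names are mine, not from the FAQ) computes the expected utility of the bet for k = 0.3 and inverts the utility function to recover CE:

```python
# Exact certainty equivalent of the $200 coin flip for a Kelly bettor
# with Kelly Number k = 0.3 and a $10,000 bankroll (a sketch; names mine).
k = 0.3
p = 1 - 1 / k                # utility exponent: u(x) = x**p / p
bankroll = 10_000.0

# Win $200 with probability 55%, lose $200 with probability 45%.
# The constant 1/p factor in u(x) cancels on both sides of the equality,
# so it is enough to work with E[x**p] directly.
expected_xp = 0.55 * (bankroll + 200) ** p + 0.45 * (bankroll - 200) ** p

# Solve (bankroll + CE)**p = E[x**p] for CE.
ce = expected_xp ** (1 / p) - bankroll
print(round(ce, 2))          # ≈ 13.38
```

The same approach works for any bet with a finite list of outcomes: average x**p over the outcomes, then raise to the 1/p power.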

Q4: How can certainty equivalents be used in a practical setting?

A4: The logarithms and exponentiation involved make exact calculations quite difficult when analyzing a complex game such as blackjack. A formula for approximating the certainty equivalent (very accurate when your advantage or disadvantage is 10% or less) is

CE = E – V/(2kB)

where CE is the certainty equivalent, E is the expected winnings, V is the variance of those winnings (i.e. the square of the standard deviation), B is your bankroll and k is your Kelly Number, a measure of the amount of risk you wish to take. The Kelly criterion corresponds to k = 1.0 and in this situation this formula closely approximates calculations based upon the log(x) utility function. When k is not 1, the utility function that you are approximating is x^(1-1/k) / (1-1/k).

For the $200 coin flip above, which has E = $20 and V = $39,600 (the standard deviation is $198.997), the formula gives CE = $13.40, quite close to the exact value of $13.38 derived above.
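As a quick check, the approximation can be computed directly; this small sketch (my names, not the FAQ's) also rederives E and V for the coin flip:

```python
# Approximate CE = E - V/(2kB) for the $200 coin flip (a sketch; names mine).
E = 0.55 * 200 + 0.45 * (-200)                       # expected winnings: $20
V = 0.55 * (200 - E) ** 2 + 0.45 * (-200 - E) ** 2   # variance: $39,600
k, B = 0.3, 10_000.0

ce_approx = E - V / (2 * k * B)
print(round(ce_approx, 2))   # 13.4, close to the exact $13.38
```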

Q5: What is the optimal bet size?

A5: CE provides a way to compare different bets. The bet with the highest CE is the one you want to make, unless all bets have a negative CE in which case you should not bet at all. For a typical hand of blackjack, the amount you expect to win, on average, is some advantage “a” times the amount bet with a variance equal to some value “u” times the square of the amount bet. To maximize the CE you must maximize

CE = ba – (b^2)u/(2kB)

where “b” is the size of the bet. A very little calculus shows that this is maximized when

b = akB / u

if “a” is positive. (If “a” is not positive you should bet as little as possible.) A quick calculation shows that your CE will be negative if you bet more than twice your optimal bet. It is better not to bet than to bet more than twice the optimal amount. Also notice that for an optimal bet the CE is precisely half the expected winnings, E. This will be discussed in more detail below.
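The three claims in this answer (the maximizer, the break-even point at twice the optimal bet, and CE = E/2 at the optimum) can be verified with a short sketch; the numbers below are illustrative, not from the FAQ:

```python
# CE(b) = b*a - b**2 * u / (2*k*B), from A5 (illustrative values; mine).
a, u = 0.01, 1.25            # 1% advantage, variance parameter
k, B = 0.3, 10_000.0

def ce(b):
    return b * a - b ** 2 * u / (2 * k * B)

b_opt = a * k * B / u        # the optimal bet from the calculus step
assert abs(ce(b_opt) - a * b_opt / 2) < 1e-12   # CE is half of E at the optimum
assert abs(ce(2 * b_opt)) < 1e-12               # CE falls to zero at twice b_opt
print(round(b_opt, 2))       # 24.0
```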

In a typical blackjack game playing one spot from a shoe under good Las Vegas strip rules, “u” is about 5/4 and “a” is about 0.6% * (t – 0.4) where “t” is the hi-lo true count. Thus your optimal bet (for k = 0.3) is

b = 0.144% B * (t – 0.4)

The value 0.144% of B or, roughly, B/700 is called your unit. Optimally, you should bet no units when t = 0.4, bet one unit when t = 1.4, bet two units when t = 2.4, and so on.
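The whole schedule can be wrapped in a small helper. The function below is hypothetical (my naming); the constants are the ones quoted above for k = 0.3:

```python
# Optimal blackjack bet from A5: b = a*k*B/u with a = 0.6% * (t - 0.4)
# and u = 5/4 (a sketch; the helper and its names are mine).
def optimal_bet(bankroll, true_count, k=0.3):
    a = 0.006 * (true_count - 0.4)   # advantage at hi-lo true count t
    u = 1.25                         # variance per squared unit bet
    return max(0.0, a * k * bankroll / u)

B = 10_000
print(round(optimal_bet(B, 1.4), 2))   # 14.4  (one unit, roughly B/700)
print(round(optimal_bet(B, 2.4), 2))   # 28.8  (two units)
```

Note the max(0.0, ...): at or below a true count of 0.4 the advantage is not positive, so the sketch bets nothing, matching the advice above.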

Q6: Which blackjack rules and camouflage are best?

A6: You cannot always bet the optimal bet. Sometimes the count and the advantage are negative. Other times you bet sub-optimally as camouflage or by mistake. Furthermore, the rules and table limits vary by casino and by table. If you can program your simulator to mimic your actions and mistakes as well as the rules of the table you can still use utility theory. Run a million or more simulated shoes. Compute the average win, E, for a shoe and the variance, V, of a shoe. With k = 0.3 and B = your bankroll, this gives you the certainty equivalent of a shoe.

CE = E – V/2kB

You should choose the rules and betting style that maximize the CE per shoe while still providing the amount of camouflage you desire! By the way, if your strategy varies from shoe to shoe, there is no reason to limit yourself to simulating shoes. If you like, you can simulate a million “weekends.” Choose the overall strategy that gives the best CE per weekend, subject to your constraints.

If you calculate that the CE is less than half the expected winnings, E, this is an indication that your variance is too high, you are taking too much risk, and you should consider betting smaller amounts. On the other hand, if the CE is greater than E/2 your variance is too low, you are not taking enough risk, and you should consider betting more. To adjust your betting unit appropriately, look at the ratio

r = kBE / V.

Optimally, you should multiply your betting unit by this ratio r.
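The adjustment rule can be sketched with hypothetical simulation output (E and V below are made-up per-shoe numbers, not results from the FAQ):

```python
# Unit adjustment from A6: if CE != E/2, scale the unit by r = k*B*E/V.
k, B = 0.3, 10_000.0
E, V = 50.0, 250_000.0       # hypothetical per-shoe mean and variance

ce = E - V / (2 * k * B)     # about 8.33: below E/2 = 25, so too much risk
r = k * B * E / V            # 0.6: shrink the betting unit

# Scaling every bet by r gives E -> r*E and V -> r**2 * V,
# which restores CE to exactly half the (new) expected winnings.
ce_after = r * E - r ** 2 * V / (2 * k * B)
assert abs(ce_after - r * E / 2) < 1e-9
print(round(ce, 2), round(r, 2), round(ce_after, 2))   # 8.33 0.6 15.0
```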

Q7: How does taxation affect proper bet size?

A7: You should be sure to include the effects of taxation on your CE for they may be quite significant. Both your average win and your variance are reduced when your government reduces the magnitude of your results via income taxes due on wins and income tax refunds on losses. If you will be able to deduct net losses for income tax purposes (which is not necessarily possible in the United States, even for professionals) then the effect is to multiply your expected winnings E by (1-t) and your variance V by (1-t)^2 where “t” is your marginal tax rate. The net effect is to raise your optimal betting unit by a factor of 1/(1-t). Intuitively, a fraction “t” of every bet you make is not a bet by you, but by your government, so you must increase your betting unit until “your part” of the bet is optimal for you.
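Under the stated assumption of fully deductible losses, the 1/(1-t) scaling of the betting unit can be checked in a few lines (illustrative values, mine):

```python
# Tax adjustment from A7: with marginal rate t, E -> (1-t)E and
# V -> (1-t)**2 V per unit bet, so the optimal unit b = a*k*B/u
# grows by a factor of 1/(1-t).  (A sketch; the numbers are illustrative.)
a, u, k, B = 0.01, 1.25, 0.3, 10_000.0
t = 0.25                     # marginal tax rate

b_pretax = a * k * B / u                                 # 24.0
b_taxed = ((1 - t) * a) * k * B / (((1 - t) ** 2) * u)   # optimal unit after tax
assert abs(b_taxed - b_pretax / (1 - t)) < 1e-9
print(round(b_taxed, 2))     # 32.0
```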

If your government does not allow income tax refunds on net losses, be sure to include that fact in your calculations! You will need to use your simulator, or approximations involving integrals of portions of a Gaussian bell curve to do this calculation. You must determine what the distribution of outcomes (i.e. profit) is for a calendar year’s worth of card counting. Instead of averaging these raw values and finding their variance immediately, first subtract from all the positive outcomes the effect of taxation. Then compute the average and variance of all the values (including all the negative outcomes and adjusted positive outcomes) and plug the result into the formula for CE.

Q8: How big is my bankroll?

A8: The answer to this question is useful outside of blackjack. Your bankroll is your bankroll for all endeavors including stock market investments, house purchases, …, and blackjack. It should *not* be divided up among your various investments. The full value should be used in calculations for any potential investment. If this bankroll size seems too large for you, see the answer to the question “What is a good Kelly Number for me?”

According to utility theory purists your bankroll is your surplus assets plus the present value of your net income after reasonable living expenses. Your surplus assets are all your assets except those you need to maintain a sufficiently comfortable lifestyle. The present value of your net income after living expenses is the answer to the question: How much additional debt are you able to take on while still maintaining a sufficiently comfortable lifestyle?

Some Notes:

Be careful not to overcount any assets or liabilities. If you have $10,000 in a savings account you should either add that to your bankroll because it is an asset or include the interest it earns as net income that could be used to pay off additional debt, but not both! If you account for the $10,000 in one of these two ways, do not also include an additional $10,000 as an amount of debt you could pay off by draining your savings account. As another example, if you have a mortgage either subtract the amount of principal from your assets or include your mortgage payments as a living expense, but not both.

If you own a house that you consider necessary for a comfortable lifestyle and have a mortgage, you must include the mortgage as a subtraction from assets or income even though you are not including the house as a surplus asset.

If your salary is *less* than your living expenses then you should *subtract* from your bankroll the amount on which you would need to be earning interest in order to maintain a sufficiently comfortable lifestyle.

Q9: What is a good Kelly Number for me?

A9: With a large Kelly Number “k” you will have a smaller chance of coming out ahead, but if you do, you will win bigger amounts. With a small Kelly Number you will have a greater chance of coming out ahead, but the amount you win will be smaller on average.

Suppose you make optimal small bets (that is, with advantage less than 10%) and keep playing until you lose half your bankroll or double your bankroll, whichever comes first. In the following chart, “Failure” is given by 1/(1+2^(2/k – 1)) and is the chance that you will end up with only half your original bankroll if you play until you either halve or double your bankroll, whichever comes first.

Fraction of Kelly   Failure   CE doubling time
1.00                33.3%     1863 shoes
0.50                11.1%     3726 shoes
0.30                 1.9%     6210 shoes
0.25                 0.8%     7452 shoes
0.20                 0.2%     9315 shoes

(The rows above are computed from the Failure and doubling-time formulas given in the surrounding text; the original table's rows did not survive.)

In the above chart, the CE doubling time is given by 1863/k and is the number of shoes you would have to play to earn a CE equal to your bankroll. The numbers here are for a particular strategy I have used for shoe games and are fairly conservative. You may be able to get there in a tenth the time if you play single deck games with a wide bet spread and little betting camouflage. Nonetheless, these numbers are useful for comparing the various Kelly Numbers to each other. Because few are willing to risk a significant chance of losing half of their bankroll, many choose a Kelly Number significantly less than 1.0.
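Both formulas are easy to tabulate. This sketch prints the failure chance and CE doubling time for a few sample Kelly Numbers (the particular k values are my picks):

```python
# Failure = 1/(1 + 2**(2/k - 1)) and CE doubling time = 1863/k, from A9.
for k in (1.0, 0.5, 0.3, 0.2):
    failure = 1 / (1 + 2 ** (2 / k - 1))
    doubling = 1863 / k
    print(f"k = {k}: failure = {failure:.1%}, doubling time = {doubling:.0f} shoes")
```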

If you are considering using a Kelly Number of 1.0 but changing your bankroll so that it is less than your bankroll as computed in FAQ Answer 8, consider the following alternative. If you want to lower your risk, utility theory purists recommend that you use the bankroll computed in FAQ Answer 8 but change your Kelly Number to something smaller than 1.0. Choose the Kelly Number so that the product of your bankroll and the Kelly Number comes out the same as it would have had you used the reduced bankroll you were tempted to use with a Kelly Number of 1.0. This is more in-line with utility theory and will present fewer headaches in the long run.
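The substitution amounts to holding the product k·B fixed. A tiny sketch (the bankroll figures are mine, purely illustrative):

```python
# A9's alternative: rather than Kelly Number 1.0 on a reduced bankroll,
# keep the full bankroll and pick k so that k*B is unchanged (sketch; mine).
full_bankroll = 100_000.0
reduced_bankroll = 30_000.0    # the smaller bankroll you were tempted to use

k_equiv = 1.0 * reduced_bankroll / full_bankroll
assert abs(k_equiv * full_bankroll - 1.0 * reduced_bankroll) < 1e-6
print(k_equiv)   # 0.3
```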

The professionals I know aim for k = 0.3.

Q10: Where can I learn more about utility theory?

A10: Almost all of the above comes to me by way of my discussions and computations with other professional blackjack card counters. The original paper by Kelly (listed below) describes Kelly Number k = 1.0 theoretically. I have also heard that the article:

Paul Samuelson, “The ‘Fallacy’ of Maximizing the Geometric Mean in Long Sequences of Investing or Gambling”, in Proc. Nat. Acad. Sci. (1971)

discusses utility theory for k not equal to 1.

The following is adapted from a 1991 rec.gambling post:

I would like to start an educational thread on rec.gambling to answer the question, “What’s the `optimal’ wager in a given favorable situation?”

To start off, in this article I supply a bibliography of Kelly criterion related literature. I highly recommend Michael Dalton’s “Blackjack: A Professional Reference” as a means of doing just this sort of literature search, though it has a lot of room for improvement. It also supplies a basic blackjack dictionary; unfortunately, in this case the definition of Kelly criterion is too simplistic for our needs.

Editor note: The Kelly Criterion definition has been improved in this book.

I don’t have very many of the references below, unfortunately. But maybe *you* have some of them and can post a summary of the optimal wagering information…

Following each reference, I have listed the source of the reference in brackets. If the source is me I list it as: {Me}

Dalton, Michael, “Blackjack: A Professional Reference”, Spur of the Moment Publishing, 1991. – Simplistic definition of Kelly criterion, plus several references. {Me}

Epstein, R.A., “Theory of Gambling and Statistical Logic”, Academic Press, New York, rev. 1977… There is a wealth of… gambling and probabilistic information, with a lengthy section on the problem of optimal wagering. {Theory of Blackjack}

Friedman, Joel H., “Understanding and Applying the Kelly Criterion”. Report to the 5th National Conference on Gambling and Risk Taking, sponsored by the Dept. of Economics of the Univ. of Nevada at Reno; 1981. – Deals with the question of simultaneous wagering. {Blackjack: A Professional Reference}

Friedman, Joel H., “Kelly vs Mini-Max”, [Magazine???] Vol.24,10, 1983. {Blackjack: A Professional Reference}

Griffin, Peter, A., “Optimal Wagers on Simultaneous Bets”, Casino & Sports; 1982, Vol. 18, p. 53-56. {Blackjack: A Professional Reference}

Griffin, Peter A., “A Fractured Fable”, [Casino & Sports?], Vol.22, 6, 1983. {Blackjack: A Professional Reference}

Griffin, Peter A., “Theory of Blackjack”, Huntington Press, 1988, chapter 9 p. 131-132 and its appendices A, B, and C p. 139-142, also chapter 14 p. 236. – This is the source of most of whatever I know about optimal wagering. This is a fine book – a must for all math weenie blackjack players. Available from Huntington Press and {Me}

Griffin, Peter A., “Gambling Ramblings”, Huntington Press, 1991, p. 85-86, 90, 119-121. – Old Casino & Sports articles that have been made into a book. Some interesting stuff regarding simultaneous wagers and discontinuities in the probability of being at an all time high with Kelly betting. {Me}

Humble, Lance, & Cooper, Carl, “World’s Greatest Blackjack Book”, Doubleday & Company (Bantam), New York; 1980,1987, page 203. {Blackjack: A Professional Reference} Note: overly simplistic Kelly criterion definition here. Don’t bother looking it up. {Me}

Kelly, J.L., “A New Interpretation of Information Rate”. IRE Transactions on Information Theory, Vol. IT-2, No. 3, Sept 1956. Bell System Technical Journal Vol. 35, 1956, pp 917-926. – Explains the Kelly criterion. Since this report may be difficult to find, the Kelly criterion is fully explained in Allan Wilson’s “Casino Gambler’s Guide”. {Blackjack: A Professional Reference}

Thorp, E.O., “Beat the Dealer: A Winning Strategy for the Game of Twenty-One”, Random House, New York, 1962; Vintage paperback also, 1966; revised edition, 1966. – {The Mathematics of Gambling}

Thorp, E.O., “Optimal Gambling Systems for Favorable Games”. Review of the International Statistics Institute, Vol. 37:3, 1969. – This contains a good discussion of the gambler’s ruin problem, as well as an analysis of several casino games from this standpoint. {Theory of Blackjack}

Thorp, E.O., “Portfolio Choice and the Kelly Criterion,” Proceedings of the 1971 Business and Economics Section of the American Statistical Association 1972, 215-224… reprinted in Investment Decision-Making, edited by J. Bicksler. Reprinted in Stochastic Optimization Models in Finance, Academic Press, edited by W. T. Ziemba, S. L. Brumelle, and R. G. Vickson, 1975, pp. 599-620. {The Mathematics of Gambling}

Thorp, E.O., “The Capital Growth Model: An Empirical Investigation,” (with James Bicksler), Journal of Financial and Quantitative Analysis, March 1973, Vol. III, No. 2, pp. 273-287. {The Mathematics of Gambling}

Thorp, E.O., “The Mathematics of Gambling”, Gambling Times, 1984, p 125-130. – Chapter on optimal betting and its application to blackjack and physical prediction roulette. {Me}

Wilson, Allan. “The Casino Gambler’s Guide”. Harper & Row, New York; 1965, 1970. – A classic. Includes Wilson’s point count strategy. Recommended reading for those interested in early blackjack strategy developments; however, his point count is not recommended. The first knowledgeable treatment of gambler’s ruin, the Kelly criterion, and betting progressions and strategies. {Blackjack: A Professional Reference}

Wong, “Blackjack World” Vol.3,162, October, 1981. {Blackjack: A Professional Reference}

???, “Rouge et Noir” 8, 1981. {Blackjack: A Professional Reference}

Q11: How can I contribute to this FAQ?

A11: Send your comments or contributions to: RedTaylor

Copyright (c) 1996 Red Taylor. Redistribution is permitted so long as (1)
no charge is made for this material and (2) this notice accompanies this
material. All other rights reserved.