Video Poker Payout Calculator Obsession

Or, how I got from several years to seven seconds.

I have spent much of the past few months obsessing over the payouts of two video poker machines: Deuces Wild and Jacks or Better. The following tables were generated by my Video Poker Payout Calculator.
Full Pay Deuces Wild (FPDW) Perfect Play Payout
With an EV of 1.0076, Full Pay Deuces Wild on average pays back 100.76% of what you put in, as long as you make all the right discard decisions. That's an average, of course; your mileage may vary, especially in shorter sessions.

Let's say you are a casino game designer looking to model a game that's more in the casino's favor, say something with a 99.99% payout. You might try ratcheting down some of the payout awards to see what happens. Or maybe you came across such a machine in a casino and are curious to know its new payout. It's just a few numbers; it doesn't look like much work, does it? The tricky one is that middle column...

So I set out to write a program that could quickly calculate the payout of either machine, allowing for arbitrary changes to the payout table. At first, this seemed like a tall task. There are 52x51x50x49x48 possible ways to get your first five cards in draw poker (52-card deck, so 52 possibilities for the 1st card, 51 for the 2nd, and so on). That's almost 312 million equally probable hands. Then, each hand has 32 (= 2^5) possible discards that each have to be evaluated for expected value. To pick the discard with the best promise in each case, all 32 discards must be considered individually. The discard with the best expected value is chosen as the optimal discard strategy for that hand, and is then used in figuring the total payout. The "expected value" (EV, statistician-speak for average payout) depends on the number of final outcomes that deliver each payout; that is, it depends on how often each outcome is reached, or in what ratio. Each discard scenario and all of its possible draw outcomes must be examined.
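The raw counts above, and the EV rule that drives each discard decision, can be sketched in a few lines of Java (the class and method names here are mine, not from the original program):

```java
public class VideoPokerCounts {
    // Ordered five-card deals from a 52-card deck: 52x51x50x49x48.
    static long orderedDeals() {
        long n = 1;
        for (int c = 52; c > 47; c--) n *= c;
        return n;
    }

    // Each hand has 2^5 = 32 subsets of cards that could be discarded.
    static int discardChoices() {
        return 1 << 5;
    }

    // EV of one discard choice: sum of (payout x outcome count),
    // divided by the total number of draw outcomes.
    static double expectedValue(double[] payouts, long[] counts) {
        double weighted = 0;
        long total = 0;
        for (int i = 0; i < payouts.length; i++) {
            weighted += payouts[i] * counts[i];
            total += counts[i];
        }
        return weighted / total;
    }

    public static void main(String[] args) {
        System.out.println(orderedDeals());   // 311875200, "almost 312 million"
        System.out.println(discardChoices()); // 32
    }
}
```

Per-hand perfect play is then just a max over the 32 expected values.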
Some of these discard scenarios are tougher than others. For example, in the case where you hold every card there's only one outcome to evaluate. If you get rid of all five cards, there are 47x46x45x44x43 (about 184 million) ways to replace the discards. Adding up all the draws from all 32 discards, there are 206,471,236 paths from a particular five-card hand to a final set of five cards. Each of these must be considered for potential payout. But remember, there are 312 million initial five-card hands. That makes on the order of 6 x 10^16 paths in total, which is where the original estimate of several years of computing came from.

Fortunately, there are a great number of redundant calculations in this strategy. First, the order of the cards does not matter. This reduces the true number of pre-discard hands by a factor of 5x4x3x2x1 = 120. The order does not matter in the final draw either; that's why the "total draws" column in the table actually maxes out around 1.5 million instead of the 184 million I was originally worried about. It turns out there are only 2.6 million deals with 2.6 million draws each, for a new combined total of "only" 6.7 trillion outcomes to evaluate. Still too big; I couldn't stop there. Well, for a while I did, but then I had one of those lightbulb-over-the-head experiences and rejoined the hunt.

MUSICAL SUITS

The suits don't matter! Okay, for flushes the suits of the cards do matter a little, but for the most part suits are not nearly as important as the ranks. Since none of the payouts requires a particular suit (a royal flush in spades pays the same as one in diamonds), you can take advantage of this. Simply put, whether you are dealt 3H6H4H8S7C or 3C6C4C8D7H, the math that dictates which cards to keep remains the same. That's because if the suits are rotated around like musical chairs (all hearts turn to clubs, all clubs turn to diamonds, and so on) and the ranks stay put, the outcome distributions remain the same. For example, hands with three different suits represented in a 3:1:1 ratio come in groups of twelve with identical results. For those hands I was able to do the computation once and then substitute those results the other 11 times. This reduces the number of initial deals to consider from 2.6 million down to 331,682. Not bad.

Then, instead of brute-force counting the number of this or that payout among all the draw outcomes, I came up with efficient algorithms for determining the exact probability of each payout, discard by discard.
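The musical-suits idea amounts to relabeling suits in order of first appearance, so that two hands differing only by a suit rotation collapse to the same key. Here is a simplified sketch (the names are mine, and a full deduplication would also need to put the cards in a canonical rank order first):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class SuitCanonicalizer {
    // Hand format assumed here: rank/suit pairs, e.g. "3H6H4H8S7C".
    static String canonicalKey(String hand) {
        // The first suit seen becomes 'a', the next distinct suit 'b', etc.
        Map<Character, Character> relabel = new LinkedHashMap<>();
        StringBuilder key = new StringBuilder();
        for (int i = 0; i < hand.length(); i += 2) {
            char rank = hand.charAt(i);
            char suit = hand.charAt(i + 1);
            relabel.putIfAbsent(suit, (char) ('a' + relabel.size()));
            key.append(rank).append(relabel.get(suit));
        }
        return key.toString();
    }

    public static void main(String[] args) {
        // The article's two example hands produce the same key, so the
        // discard analysis only needs to run once for both.
        System.out.println(canonicalKey("3H6H4H8S7C")); // 3a6a4a8b7c
        System.out.println(canonicalKey("3C6C4C8D7H")); // 3a6a4a8b7c
    }
}
```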
Strict tests of the algorithms were established, spot-checking them for accuracy against the slower brute-force counts. This turned out to be a great application of the probability theory course I took in college, but ultimately was time I could have better spent at the beach.

As I fine-tuned my algorithms with the aid of a code profiling tool (let me just say that Borland's OptimizeIt Suite kicks major butt) and a lot of sneaky shortcuts (like using threads to divide the work between the two CPUs), the total running time of the payout calculation was reduced to about five minutes. While this was a huge improvement over the brute-force counts (which took several days), and way better than the original estimate of several years, I still felt it was unacceptable.

On the plus side, while still too slow for analyzing the entire data set, the algorithms are lightning-fast at picking the best discard for individual hands. This came in handy when I put together my Video Poker Trainer. The "draw" button lights up when you have chosen the best discard. The trainer does its analysis using my algorithm-driven Discard Wizard.

REDUNDANCY HUNT

I was still convinced that the payout calculator could be faster, and it occurred to me that I could precalculate much of the data. For sufficiently hairy computations such as these, it's a lot quicker to read a large number or result set from a file or database than it is to actually compute it. I had thought of this early on, but until now the amount of data was way too large for even my 70-gig drive. Now that I had eliminated a lot of the redundancy, the dataset of outcomes had finally been reduced to a manageable size. Roughly 300 megs of data per game were produced, representing the probability ratios of all payouts among all 32 discards in all 331,682 distinct deals. For the first time I could run comparisons on the dataset as a whole, instead of considering just one part at a time in the relatively small space of active memory.
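The precalculation idea, writing the expensive counts out once and reading them back later, might look something like the sketch below. The binary layout here is an assumption of mine, not the article's actual file format:

```java
import java.io.*;
import java.nio.file.*;

public class OutcomeCache {
    // Dump per-deal payout counts as raw ints: two header ints for the
    // dimensions, then the counts row by row.
    static void write(Path file, int[][] countsPerDeal) throws IOException {
        try (DataOutputStream out = new DataOutputStream(
                new BufferedOutputStream(Files.newOutputStream(file)))) {
            out.writeInt(countsPerDeal.length);
            out.writeInt(countsPerDeal[0].length);
            for (int[] deal : countsPerDeal)
                for (int c : deal) out.writeInt(c);
        }
    }

    // Reload the table; reading is far cheaper than re-deriving it.
    static int[][] read(Path file) throws IOException {
        try (DataInputStream in = new DataInputStream(
                new BufferedInputStream(Files.newInputStream(file)))) {
            int deals = in.readInt();
            int payouts = in.readInt();
            int[][] counts = new int[deals][payouts];
            for (int[] deal : counts)
                for (int i = 0; i < deal.length; i++) deal[i] = in.readInt();
            return counts;
        }
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("outcomes", ".bin");
        write(tmp, new int[][] {{1, 2, 3}, {4, 5, 6}});
        System.out.println(read(tmp)[1][2]); // prints 6
    }
}
```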
I combed the database of outcomes for any remaining redundancy. My analysis revealed that there are only about 127,000 unique discard "fingerprints" (payout probability counts across all 32 discards) for Jacks or Better and just over 100,000 for Deuces Wild. As before, these were involved computations wherein the data could now be read in and processed just once, with the results substituted whenever it came time to do that calculation for a matching hand. So the number of significant deals was cut again, this time by a factor of three.

A table of all significant deal fingerprints was saved to a file for each game. Each fingerprint has around 350 payout counts (32 discards, 10-12 payout counts each). Once compressed, each game produced a file about 11 megs in size, which I call a strategy file. When the final program gets fed a new paytable, the appropriate game strategy file is processed to determine perfect play in all scenarios. As the strategy unfolds, the new total payout is broken out and calculated.

New running time (once I optimized all those slow Strings out of the Java I/O code): seven seconds. This I can live with.

Of course, total payout measured in EV is only one factor in determining how you'll do in the casino. Variance and bankroll can play a big part too... (more to come).

Here are the 12 ways to mix the suits in a five-card hand when three suits are present in a 3:1:1 ratio (note that order doesn't matter but relative suit ratio does):

HHHSD HHHSC HHHDC
CCCHS CCCHD CCCSD
DDDHC DDDHS DDDSC
SSSHD SSSHC SSSDC

For five-card hands there are only six suit ratio profiles: AAAAA (all one suit), AAAAB (4:1), AAABB (3:2), AAABC (3:1:1), AABBC (2:2:1), and AABCD (2:1:1:1). For example,
in the AABCD case (2:1:1:1) you must divide P(4,4) (ways to assign four suits to four letters, 24) by P(3,3) = 6 (the orderings of the three interchangeable single-card suits), giving 4 distinct assignments.

A breakdown of the number of calculations saved:

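The profile arithmetic above generalizes to all six profiles: assign suits to letters in P(4, k) ways for a k-part profile, then divide by the factorial of each group of interchangeable letters (parts of equal size). A short sketch, with names of my own choosing:

```java
import java.util.HashMap;
import java.util.Map;

public class SuitProfiles {
    // profile lists the part sizes, e.g. {3, 1, 1} for AAABC.
    static int assignments(int[] profile) {
        // P(4, k): ordered ways to pick a suit for each letter.
        int perm = 1;
        for (int i = 0; i < profile.length; i++) perm *= (4 - i);
        // Letters with the same part size are interchangeable, so
        // divide by m! for each group of m equal-sized parts.
        Map<Integer, Integer> multiplicity = new HashMap<>();
        for (int part : profile) multiplicity.merge(part, 1, Integer::sum);
        int div = 1;
        for (int m : multiplicity.values())
            for (int i = 2; i <= m; i++) div *= i;
        return perm / div;
    }

    public static void main(String[] args) {
        System.out.println(assignments(new int[]{3, 1, 1}));    // 12 (AAABC)
        System.out.println(assignments(new int[]{2, 1, 1, 1})); // 4  (AABCD)
    }
}
```

This reproduces the twelve 3:1:1 suit mixes listed earlier and the four AABCD assignments from the division example.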